I've noticed on both 2.9.37 and 3.0.2 that when bootstrapping Juju on microk8s, the controller-0 pod restarts once shortly after bootstrap. Here's the output of `get pods` immediately after bootstrapping with Juju 3.0.2:
```
[...]
Bootstrap complete, controller "microk8s-localhost" is now available in namespace "controller-microk8s-localhost"
Now you can run
juju add-model <model-name>
to create a new model to deploy k8s workloads.
mthaddon@finistere:~$ microk8s kubectl get pods --all-namespaces
NAMESPACE                       NAME                                      READY   STATUS    RESTARTS      AGE
kube-system                     calico-node-txnvf                         1/1     Running   0             7m22s
kube-system                     coredns-d489fb88-rcrkm                    1/1     Running   0             6m36s
kube-system                     calico-kube-controllers-dfbcc8b44-nd98s   1/1     Running   0             7m22s
kube-system                     hostpath-provisioner-766849dd9d-c968j     1/1     Running   0             5m6s
ingress                         nginx-ingress-microk8s-controller-6kjm2   1/1     Running   0             5m6s
container-registry              registry-6674bf676f-tdtbh                 1/1     Running   0             5m6s
controller-microk8s-localhost   controller-0                              3/3     Running   1 (41s ago)   4m54s
controller-microk8s-localhost   modeloperator-66fc59d5ff-jcr69            1/1     Running   0             24s
```
Looking at the controller-0 logs in 3.0.2, I see the following:
```
2023-01-17T08:28:16.793Z [pebble] HTTP API server listening on ":38812".
2023-01-17T08:28:16.793Z [pebble] Started daemon.
2023-01-17T08:28:16.796Z [pebble] POST /v1/services 3.011123ms 202
2023-01-17T08:28:16.797Z [pebble] Started default services with change 1.
2023-01-17T08:28:16.798Z [pebble] Service "container-agent" starting: /charm/bin/containeragent unit --data-dir /var/lib/juju --append-env "PATH=$PATH:/charm/bin" --show-log --charm-modified-version 0 --controller
2023-01-17T08:28:16.820Z [container-agent] 2023-01-17 08:28:16 INFO juju.cmd supercommand.go:56 running containerAgent [3.0.2 8bf53dc35b25145ef39051fe4136135a3dd53d5d gc go1.19.3]
2023-01-17T08:28:16.820Z [container-agent] starting containeragent unit command
2023-01-17T08:28:16.820Z [container-agent] containeragent unit "unit-controller-0" start (3.0.2 [gc])
2023-01-17T08:28:16.820Z [container-agent] 2023-01-17 08:28:16 INFO juju.cmd.containeragent.unit runner.go:556 start "unit"
2023-01-17T08:28:16.820Z [container-agent] 2023-01-17 08:28:16 INFO juju.worker.upgradesteps worker.go:60 upgrade steps for 3.0.2 have already been run.
2023-01-17T08:28:16.821Z [container-agent] 2023-01-17 08:28:16 INFO juju.worker.probehttpserver server.go:157 starting http server on 127.0.0.1:65301
2023-01-17T08:28:17.232Z [container-agent] 2023-01-17 08:28:17 ERROR juju.worker.dependency engine.go:695 "api-caller" manifold worker returned unexpected error: [3ef264] "unit-controller-0" cannot open api: unable to connect to API: dial tcp 127.0.0.1:17070: connect: connection refused
2023-01-17T08:28:21.267Z [container-agent] 2023-01-17 08:28:21 ERROR juju.worker.dependency engine.go:695 "api-caller" manifold worker returned unexpected error: [3ef264] "unit-controller-0" cannot open api: unable to connect to API: dial tcp 127.0.0.1:17070: connect: connection refused
2023-01-17T08:28:25.933Z [container-agent] 2023-01-17 08:28:25 ERROR juju.worker.dependency engine.go:695 "api-caller" manifold worker returned unexpected error: [3ef264] "unit-controller-0" cannot open api: unable to connect to API: dial tcp 127.0.0.1:17070: connect: connection refused
2023-01-17T08:28:26.797Z [pebble] Check "readiness" failure 1 (threshold 3): received non-20x status code 404
2023-01-17T08:28:26.797Z [pebble] Check "liveness" failure 1 (threshold 3): received non-20x status code 404
2023-01-17T08:28:31.018Z [container-agent] 2023-01-17 08:28:31 ERROR juju.worker.dependency engine.go:695 "api-caller" manifold worker returned unexpected error: [3ef264] "unit-controller-0" cannot open api: unable to connect to API: dial tcp 127.0.0.1:17070: connect: connection refused
2023-01-17T08:28:36.799Z [pebble] Check "readiness" failure 2 (threshold 3): received non-20x status code 404
2023-01-17T08:28:36.799Z [pebble] Check "liveness" failure 2 (threshold 3): received non-20x status code 404
2023-01-17T08:28:37.157Z [container-agent] 2023-01-17 08:28:37 ERROR juju.worker.dependency engine.go:695 "api-caller" manifold worker returned unexpected error: [3ef264] "unit-controller-0" cannot open api: unable to connect to API: dial tcp 127.0.0.1:17070: connect: connection refused
2023-01-17T08:28:45.721Z [container-agent] 2023-01-17 08:28:45 ERROR juju.worker.dependency engine.go:695 "api-caller" manifold worker returned unexpected error: [3ef264] "unit-controller-0" cannot open api: unable to connect to API: dial tcp 127.0.0.1:17070: connect: connection refused
2023-01-17T08:28:46.796Z [pebble] Check "readiness" failure 3 (threshold 3): received non-20x status code 404
2023-01-17T08:28:46.796Z [pebble] Check "readiness" failure threshold 3 hit, triggering action
2023-01-17T08:28:46.796Z [pebble] Check "liveness" failure 3 (threshold 3): received non-20x status code 404
2023-01-17T08:28:46.796Z [pebble] Check "liveness" failure threshold 3 hit, triggering action
2023-01-17T08:28:46.796Z [pebble] Service "container-agent" on-check-failure action is "restart", terminating process before restarting
2023-01-17T08:28:46.801Z [pebble] Service "container-agent" exited after check failure, restarting
2023-01-17T08:28:46.801Z [pebble] Service "container-agent" on-check-failure action is "restart", waiting ~500ms before restart (backoff 1)
2023-01-17T08:28:47.340Z [pebble] Service "container-agent" starting: /charm/bin/containeragent unit --data-dir /var/lib/juju --append-env "PATH=$PATH:/charm/bin" --show-log --charm-modified-version 0 --controller
2023-01-17T08:28:47.368Z [container-agent] 2023-01-17 08:28:47 INFO juju.cmd supercommand.go:56 running containerAgent [3.0.2 8bf53dc35b25145ef39051fe4136135a3dd53d5d gc go1.19.3]
2023-01-17T08:28:47.368Z [container-agent] starting containeragent unit command
2023-01-17T08:28:47.368Z [container-agent] containeragent unit "unit-controller-0" start (3.0.2 [gc])
2023-01-17T08:28:47.368Z [container-agent] 2023-01-17 08:28:47 INFO juju.cmd.containeragent.unit runner.go:556 start "unit"
```
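For reference, the restart behaviour in the log matches Pebble's `on-check-failure` mechanism: each check that reaches its failure threshold triggers the configured action on the service. A layer along these lines would reproduce what the log shows (this is a hypothetical sketch based only on the log output above, not the actual layer Juju generates; the check period, levels, and the probe URL paths are assumptions, though the port 65301 and threshold of 3 appear in the log):

```yaml
# Hypothetical Pebble layer sketch -- illustrative only, not Juju's real layer.
services:
  container-agent:
    override: replace
    startup: enabled
    command: /charm/bin/containeragent unit --data-dir /var/lib/juju --append-env "PATH=$PATH:/charm/bin" --show-log --charm-modified-version 0 --controller
    on-check-failure:
      # Matches 'Service "container-agent" on-check-failure action is "restart"'
      liveness: restart
      readiness: restart
checks:
  liveness:
    override: replace
    level: alive
    period: 10s      # consistent with failures logged at :26, :36, :46
    threshold: 3     # "failure threshold 3 hit, triggering action"
    http:
      url: http://127.0.0.1:65301/liveness    # path assumed; port from the log
  readiness:
    override: replace
    level: ready
    period: 10s
    threshold: 3
    http:
      url: http://127.0.0.1:65301/readiness   # path assumed
```

With a layer like this, any ten-second window in which the probe HTTP server returns 404 counts as one failure for each check, and three consecutive failures restart the service, which is exactly the sequence visible in the log.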
This seems like a timing / startup-ordering issue between the Juju workers and how the probes have been configured: the liveness and readiness checks return 404 while the `api-caller` worker is still unable to reach the API server on 127.0.0.1:17070, so Pebble hits its failure threshold of 3 and restarts the `container-agent` service. Since everything does come good in the end, it's more of an annoyance than a critical issue, but one we should address for sure.