
mongodb-k8s-operator's Introduction

Charmed MongoDB K8s Documentation

Overview

MongoDB is a popular NoSQL database application. It stores data in JSON-like documents, giving users a flexible data model with easy-to-use data aggregation for analytics. In addition, it is a distributed database, so vertical and horizontal scaling come naturally.
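
For illustration, a minimal PyMongo sketch of the document model and an aggregation pipeline; the connection URI, database and collection names below are placeholders, not part of the charm:

from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder URI
orders = client["shop"]["orders"]                   # placeholder database/collection

# Documents are JSON-like; fields may vary between documents.
orders.insert_many([
    {"item": "tea", "qty": 2, "price": 3.5},
    {"item": "coffee", "qty": 1, "price": 4.0},
])

# Aggregation pipeline: total revenue per item.
for row in orders.aggregate([
    {"$group": {"_id": "$item", "revenue": {"$sum": {"$multiply": ["$qty", "$price"]}}}}
]):
    print(row)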

Applications like MongoDB must be managed and operated in production environments. This means that MongoDB application administrators and analysts who run workloads in various infrastructures should be able to automate tasks for repeatable operational work. Technologies such as software operators encapsulate the knowledge, wisdom and expertise of a real-world operations team and codify it into a computer program that helps to operate complex server applications like MongoDB and other databases. Canonical has developed an open-source operator called Charmed MongoDB for this purpose.

The Charmed MongoDB Kubernetes (K8s) operator deploys and operates MongoDB in multi-cloud environments using Kubernetes. Software operators are principally associated with Kubernetes, an open-source container orchestration system that facilitates the deployment and operation of complex server applications. As a concept, however, they can be applied equally to application and infrastructure operations on platforms beyond Kubernetes. They are especially popular on Kubernetes because they can help to reduce the complexity of operations on this powerful but complex platform.

Charmed MongoDB (K8s Operator) is an enhanced, open source and fully-compatible drop-in replacement for the MongoDB Community Edition with advanced MongoDB enterprise features. It simplifies the deployment, scaling, design and management of MongoDB in production in a reliable way. In addition, you can use the operator to manage your MongoDB clusters with automation capabilities. It also offers replication, TLS, password rotation, easy-to-use application integration, backup, restore, and monitoring.

Aside from the Kubernetes operator, Canonical has developed another operator called Charmed MongoDB (VM operator) that operates on physical machines, virtual machines (VMs) and a wide range of cloud and cloud-like environments, including AWS, Azure, OpenStack and VMware.

Software and releases

Charmed MongoDB (K8s Operator) uses the Charmed MongoDB rock, which relies on the Charmed MongoDB snap package. The snap offers more features than the MongoDB Community version, such as backup and restore, monitoring and security features.

To see the Charmed MongoDB features and releases, visit our GitHub Releases and Release Notes pages.

Charm version, environment and OS

A charm version is a combination of the application version and the channel, separated by a / (slash), e.g. 6/edge and 5/edge.

5/edge Charmhub - 5/edge GitHub
6/edge Charmhub - 6/edge GitHub

You can deploy the charm to a stand-alone machine or to cloud and cloud-like environments, including AWS, Azure, OpenStack and VMware.

The upper portion of this page describes the Operating System (OS) where the charm can run. For example, 6/stable is compatible with and should run on a machine with Ubuntu 22.04.

Supported architectures

The charm should be run on architectures that support AVX.

To check whether your architecture supports AVX, please execute:

grep avx /proc/cpuinfo

or

grep avx2 /proc/cpuinfo

Supported Juju versions

At the moment, the MongoDB K8s charm supports Juju 3.x.

Security, bugs and feature requests

If you find a bug in this operator or want to request a specific feature, here are the useful links:

Contributing

Please see the Juju SDK docs for guidelines on enhancements to this charm following best practice guidelines, and CONTRIBUTING.md for developer guidance.

Licence statement

The Charmed MongoDB Operator is free software, distributed under the Apache Software License, version 2.0. It installs and operates Percona Server for MongoDB, which is licensed under the Server Side Public License (SSPL) version 1.

Trademark Notice

“MongoDB” is a trademark or registered trademark of MongoDB, Inc. Other trademarks are property of their respective owners. Charmed MongoDB is not sponsored, endorsed, or affiliated with MongoDB, Inc.

mongodb-k8s-operator's People

Contributors

carlcsaposs-canonical, delgod, dmitry-ratushnyy, dragomirp, github-actions[bot], jardon, juditnovak, mehdi-bendriss, merkata, miaaltieri, renovate[bot], taurus-forever


mongodb-k8s-operator's Issues

Tests in HA occasionally fail to retrieve primary

Bug Description

Tests in HA occasionally fail to retrieve the primary because a non-reachable unit can be chosen when creating the replica set client.

Two ways to fix this:

  1. Use all units when creating the MongoDB client.
  2. Pass an "excluded_unit" to replica_set_primary, which will then be passed on to get_mongodb_client (see the sketch below).
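
A minimal sketch of option 2; replica_set_primary and get_mongodb_client are stand-ins for the test helpers and may not match their actual signatures:

def replica_set_primary(units, excluded_unit=None):
    """Return the primary's address, skipping a known-unreachable unit."""
    reachable_units = [u for u in units if u != excluded_unit]
    client = get_mongodb_client(reachable_units)  # hypothetical helper
    try:
        return client.primary  # pymongo exposes the current primary's (host, port)
    finally:
        client.close()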

To Reproduce

run HA tests

Environment

environment set up by CI

Relevant log output

https://github.com/canonical/mongodb-k8s-operator/actions/runs/3639102920/jobs/6142270318

Additional context

No response

Mongo unit error state on teardown

Steps to reproduce

Destroy a model with multiple applications related to mongodb-k8s.

Expected behavior

Model tears down cleanly

Actual behavior

Mongo charm goes into error state. The following is observed in the juju debug-log:

Versions

Operating system: Ubuntu 22.04.3 LTS
Juju CLI: 3.1.7-genericlinux-amd64
Juju agent: 3.1.7
Charm revision: 5/edge 36
microk8s: MicroK8s v1.27.7 revision 6101

Log output

Juju debug log:
log.txt

unit-mongodb-k8s-0: 20:46:42 INFO unit.mongodb-k8s/0.juju-log database:15: Update relation user: relation-7 on free5gc
unit-mongodb-k8s-0: 20:46:42 INFO unit.mongodb-k8s/0.juju-log database:15: Updating relation data according to diff
unit-mongodb-k8s-0: 20:46:42 ERROR unit.mongodb-k8s/0.juju-log database:15: Uncaught exception while in charm code:
Traceback (most recent call last):
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/./src/charm.py", line 1218, in <module>
    main(MongoDBCharm)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/main.py", line 441, in main
    _emit_charm_event(charm, dispatcher.event_name)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/main.py", line 149, in _emit_charm_event
    event_to_emit.emit(*args, **kwargs)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 344, in emit
    framework._emit(event)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 833, in _emit
    self._reemit(event_path)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 922, in _reemit
    custom_handler(event)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/lib/charms/mongodb/v0/mongodb_provider.py", line 97, in _on_relation_event
    self.oversee_users(departed_relation_id, event)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/lib/charms/mongodb/v0/mongodb_provider.py", line 149, in oversee_users
    self._diff(event)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/lib/charms/mongodb/v0/mongodb_provider.py", line 175, in _diff
    key: value for key, value in event.relation.data[event.app].items() if key != "data"
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/model.py", line 1361, in __getitem__
    raise KeyError(
KeyError: 'Cannot index relation data with "None". Are you trying to access remote app data during a relation-broken event? This is not allowed.'
unit-mongodb-k8s-0: 20:46:42 ERROR juju.worker.uniter.operation hook "database-relation-broken" (via hook dispatching script: dispatch) failed: exit status 1

Transient error in querying server information

Issue

The MongoDB charm can produce transient errors when querying the server (MongoDB) status. This is because of an untrapped exception that was observed when scaling up. Though this does not interfere with the scaling and creation of the replica set, it is nevertheless visible transiently in the Juju status.
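
A minimal sketch of how such a call could be guarded, assuming a pymongo MongoClient (illustrative only, not the charm's actual fix):

from pymongo.errors import PyMongoError

def safe_server_version(client):
    """Return the MongoDB server version, or None while the server is not ready."""
    try:
        return client.server_info()["version"]
    except PyMongoError:
        # Covers transient failures such as the HMAC key-cache error below,
        # which can appear briefly while a replica set is still converging.
        return None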

unit-mongodb-k8s-0: 16:46:50 ERROR unit.mongodb-k8s/0.juju-log mongodb:0: Uncaught exception while in charm code:
Traceback (most recent call last):
  File "./src/charm.py", line 368, in <module>
    main(MongoDBCharm, use_juju_for_storage=True)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/main.py", line 414, in main
    charm = charm_class(framework)
  File "./src/charm.py", line 60, in __init__
    if self._stored.mongodb_initialized and self.mongo.version:
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/src/mongoserver.py", line 255, in version
    info = client.server_info()
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/mongo_client.py", line 1994, in server_info
    return self.admin.command("buildinfo",
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/database.py", line 759, in command
    return self._command(sock_info, command, secondary_ok, value,
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/database.py", line 641, in _command
    return sock_info.command(
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/pool.py", line 710, in command
    return command(self, dbname, spec, secondary_ok,
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/network.py", line 161, in command
    helpers._check_command_response(
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/helpers.py", line 167, in _check_command_response
    raise OperationFailure(errmsg, code, response, max_wire_version)
pymongo.errors.OperationFailure: Cache Reader No keys found for HMAC that is valid for time: { ts: Timestamp(1643301977, 1) } with id: 7057927948019433476, full error: {'ok': 0.0, 'errmsg': 'Cache Reader No keys found for HMAC that is valid for time: { ts: Timestamp(1643301977, 1) } with id: 7057927948019433476', 'code': 211, 'codeName': 'KeyNotFound'}
unit-mongodb-k8s-0: 16:46:50 ERROR juju.worker.uniter.operation hook "mongodb-relation-changed" (via hook dispatching script: dispatch) failed: exit status 1

HA-tests of 5/edge are failing

Steps to reproduce

Expected behavior

Actual behavior

Versions

Operating system:

Juju CLI:

Juju agent:

Charm revision:

microk8s:

Log output

Juju debug log:

Additional context

Reference to TLS charm + Typo

In the following page, there is a link that refers to the previously used TLS charm instead of the new self-signed-certs charm, even though the tutorial uses the latter.
Also, the TLS header has a typo: "Transcript Layer Security (TLS)" should become "Transport Layer Security (TLS)".

Rename Repo

Per the naming guidelines recently published, this repo should be renamed mongodb-k8s-operator. Will need a ticket into IS.

Update to PyMongo 4 and MongoDB 5

Issue

The manner in which replica sets are initialized has changed subtly in PyMongo 4 and MongoDB 5. As a result, the existing MongoDB charm is not compatible with these latest releases. An example of MongoDB log files illustrating the problem is attached.
mongo-edge.log
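
For context, a minimal sketch of initiating a replica set with PyMongo 4; the host and replica-set name are placeholders, and the notable change is that PyMongo 4 generally needs directConnection=True to talk to a member that has not been initiated yet:

from pymongo import MongoClient

# Connect directly to the single, not-yet-initiated member; without
# directConnection=True the PyMongo 4 client waits for replica-set discovery.
client = MongoClient("mongodb://mongodb-k8s-0.mongodb-k8s-endpoints:27017",
                     directConnection=True)

config = {
    "_id": "mongodb-k8s",  # replica set name (placeholder)
    "members": [{"_id": 0, "host": "mongodb-k8s-0.mongodb-k8s-endpoints:27017"}],
}
client.admin.command("replSetInitiate", config)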

Mongodb-k8s goes in error when restoring from backup after reinstallation

Steps to reproduce

  1. juju add-model test
  2. juju consume s3-integrator # deployed in another model
  3. juju deploy mongodb-k8s --channel 5/edge
  4. juju relate mongodb-k8s s3-integrator
  5. juju run mongodb-k8s/0 create-backup
  6. juju run mongodb-k8s/0 list-backups # wait until the backup is complete
  7. juju remove-application mongodb-k8s
  8. juju remove-storage mongodb/0
  9. juju deploy mongodb-k8s --channel 5/edge
  10. juju relate mongodb-k8s s3-integrator
  11. juju run mongodb-k8s/0 list-backups # Confirm backup is there
  12. juju run mongodb-k8s/0 restore backup-id=<backup_id>

Expected behavior

Mongo finishes restoring properly and the unit goes into active/idle status.

Actual behavior

Mongo goes into an error state with a stack trace in the logs:

unit-mongodb-k8s-0: 19:56:40 INFO unit.mongodb-k8s/0.juju-log Deferring reconfigure: error=OperationFailure("Authentication failed., full error: {'ok': 0.0, 'errmsg': 'Authentication failed.', 'code': 18, 'codeName': 'AuthenticationFailed', '$clusterTime': {'clusterTime': Timestamp(1692921400, 1), 'signature': {'hash': b'\\xd6]\\xd4(\\xa4\\xe7M\\x0c=v\\xf1\\x08q\\x94\\x1b\\xeb\\x91\\xf0n\\xe8', 'keyId': 7271039350459072517}}, 'operationTime': Timestamp(1692921400, 1)}")
unit-mongodb-k8s-0: 19:56:40 ERROR unit.mongodb-k8s/0.juju-log Uncaught exception while in charm code:
Traceback (most recent call last):
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/./src/charm.py", line 1218, in <module>
    main(MongoDBCharm)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/main.py", line 441, in main
    _emit_charm_event(charm, dispatcher.event_name)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/main.py", line 149, in _emit_charm_event
    event_to_emit.emit(*args, **kwargs)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 344, in emit
    framework._emit(event)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 833, in _emit
    self._reemit(event_path)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 922, in _reemit
    custom_handler(event)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/./src/charm.py", line 479, in _on_update_status
    mongodb_status = build_unit_status(
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/lib/charms/mongodb/v0/helpers.py", line 168, in build_unit_status
    replset_status = mongo.get_replset_status()
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/lib/charms/mongodb/v0/mongodb.py", line 191, in get_replset_status
    rs_status = self.client.admin.command("replSetGetStatus")
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/_csot.py", line 105, in csot_wrapper
    return func(self, *args, **kwargs)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/database.py", line 805, in command
    with self.__client._socket_for_reads(read_preference, session) as (
  File "/usr/lib/python3.10/contextlib.py", line 135, in __enter__
    return next(self.gen)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/mongo_client.py", line 1282, in _socket_from_server
    with self._get_socket(server, session) as sock_info:
  File "/usr/lib/python3.10/contextlib.py", line 135, in __enter__
    return next(self.gen)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/mongo_client.py", line 1217, in _get_socket
    with server.get_socket(handler=err_handler) as sock_info:
  File "/usr/lib/python3.10/contextlib.py", line 135, in __enter__
    return next(self.gen)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/pool.py", line 1407, in get_socket
    sock_info = self._get_socket(handler=handler)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/pool.py", line 1520, in _get_socket
    sock_info = self.connect(handler=handler)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/pool.py", line 1378, in connect
    sock_info.authenticate()
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/pool.py", line 870, in authenticate
    auth.authenticate(creds, self)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/auth.py", line 549, in authenticate
    auth_func(credentials, sock_info)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/auth.py", line 473, in _authenticate_default
    return _authenticate_scram(credentials, sock_info, "SCRAM-SHA-256")
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/auth.py", line 241, in _authenticate_scram
    res = sock_info.command(source, cmd)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/pool.py", line 767, in command
    return command(
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/network.py", line 166, in command
    helpers._check_command_response(
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/helpers.py", line 181, in _check_command_response
    raise OperationFailure(errmsg, code, response, max_wire_version)
pymongo.errors.OperationFailure: Authentication failed., full error: {'ok': 0.0, 'errmsg': 'Authentication failed.', 'code': 18, 'codeName': 'AuthenticationFailed', '$clusterTime': {'clusterTime': Timestamp(1692921400, 1), 'signature': {'hash': b'\xd6]\xd4(\xa4\xe7M\x0c=v\xf1\x08q\x94\x1b\xeb\x91\xf0n\xe8', 'keyId': 7271039350459072517}}, 'operationTime': Timestamp(1692921400, 1)}

Versions

Operating system: Ubuntu 23.04

Juju CLI: 3.1.5-genericlinux-amd64

Juju agent: 3.1.5

Charm revision: 36

microk8s: MicroK8s v1.27.4 revision 5657

Log output

Juju debug log:

Additional context

MongoDB <-> Grafana Agent integration doesn't work behind a proxy

Steps to reproduce

  1. Get an environment hidden behind a proxy
  2. Deploy grafana-agent-k8s
  3. Deploy mongodb-k8s from 5/edge channel
  4. Integrate both applications: juju integrate mongodb-k8s:logging grafana-agent-k8s:logging-provider

Expected behavior

Charms are integrated and go to Active-Idle state

Actual behavior

Traceback (most recent call last):
  File "/usr/lib/python3.10/urllib/request.py", line 1348, in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
  File "/usr/lib/python3.10/http/client.py", line 1283, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.10/http/client.py", line 1329, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.10/http/client.py", line 1278, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.10/http/client.py", line 1038, in _send_output
    self.send(msg)
  File "/usr/lib/python3.10/http/client.py", line 976, in send
    self.connect()
  File "/usr/lib/python3.10/http/client.py", line 1448, in connect
    super().connect()
  File "/usr/lib/python3.10/http/client.py", line 942, in connect
    self.sock = self._create_connection(
  File "/usr/lib/python3.10/socket.py", line 845, in create_connection
    raise err
  File "/usr/lib/python3.10/socket.py", line 833, in create_connection
    sock.connect(sa)
TimeoutError: [Errno 110] Connection timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/./src/charm.py", line 1218, in <module>
    main(MongoDBCharm)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/main.py", line 441, in main
    _emit_charm_event(charm, dispatcher.event_name)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/main.py", line 149, in _emit_charm_event
    event_to_emit.emit(*args, **kwargs)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 344, in emit
    framework._emit(event)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 833, in _emit
    self._reemit(event_path)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 922, in _reemit
    custom_handler(event)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/lib/charms/loki_k8s/v0/loki_push_api.py", line 1803, in _on_pebble_ready
    self._setup_promtail()
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/lib/charms/loki_k8s/v0/loki_push_api.py", line 2272, in _setup_promtail
    self._obtain_promtail(promtail_binaries[self._arch])
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/lib/charms/loki_k8s/v0/loki_push_api.py", line 1971, in _obtain_promtail
    self._download_and_push_promtail_to_workload(promtail_info)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/lib/charms/loki_k8s/v0/loki_push_api.py", line 2093, in _download_and_push_promtail_to_workload
    with request.urlopen(promtail_info["url"]) as r:
  File "/usr/lib/python3.10/urllib/request.py", line 216, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python3.10/urllib/request.py", line 519, in open
    response = self._open(req, data)
  File "/usr/lib/python3.10/urllib/request.py", line 536, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
  File "/usr/lib/python3.10/urllib/request.py", line 496, in _call_chain
    result = func(*args)
  File "/usr/lib/python3.10/urllib/request.py", line 1391, in https_open
    return self.do_open(http.client.HTTPSConnection, req,
  File "/usr/lib/python3.10/urllib/request.py", line 1351, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 110] Connection timed out>

After this exception, the charm stays in the Maintenance/Executing state forever.

Versions

Operating system: Ubuntu 22.04.2 LTS

Juju CLI: 3.1.5-genericlinux-amd64

Juju agent: 3.1.5

Charm revision: Channel 5/edge, Revision 36

microk8s: MicroK8s v1.27.4 revision 5657

Log output

Juju debug log:

Relevant part presented above

Additional context

When mongodb-k8s is scaled up or down, the endpoints and URIs are not updated.

Bug Description

When mongodb-k8s is scaled up or down, the endpoints and URIs are not updated.

To Reproduce

When mongodb-k8s is scaled up or down, the endpoints and URIs are not updated.

Steps to reproduce the issue:

juju deploy mongodb-k8s --channel=edge -n 3
juju deploy data-integrator --channel=edge
juju config data-integrator database-name=pippo
juju relate data-integrator mongodb-k8s

juju run-action data-integrator/0 get-credentials --wait
unit-data-integrator-0:
  UnitId: data-integrator/0
  id: "4"
  results:
    mongodb:
      database: pippo
      endpoints: mongodb-k8s-2.mongodb-k8s-endpoints,mongodb-k8s-1.mongodb-k8s-endpoints,mongodb-k8s-0.mongodb-k8s-endpoints
      password: CFrabjGfvU0QNdzgUXzWLPZaVewZCwpx
      replset: mongodb-k8s
      uris: mongodb://relation-2:CFrabjGfvU0QNdzgUXzWLPZaVewZCwpx@mongodb-k8s-2.mongodb-k8s-endpoints,mongodb-k8s-1.mongodb-k8s-endpoints,mongodb-k8s-0.mongodb-k8s-endpoints/pippo?replicaSet=mongodb-k8s&authSource=admin
      username: relation-2
    ok: "True"
  status: completed
  timing:
    completed: 2023-02-22 11:59:15 +0000 UTC
    enqueued: 2023-02-22 11:59:13 +0000 UTC
    started: 2023-02-22 11:59:14 +0000 UTC

Then I removed a unit:

juju scale-application mongodb-k8s  2
juju status


Model       Controller  Cloud/Region        Version  SLA          Timestamp
test-mongo  k8s         microk8s/localhost  2.9.38   unsupported  13:01:15+01:00

App              Version  Status  Scale  Charm            Channel  Rev  Address         Exposed  Message
data-integrator           active      1  data-integrator  edge       4  10.152.183.215  no       
mongodb-k8s               active      2  mongodb-k8s      edge      18  10.152.183.66   no       

Unit                Workload  Agent  Address       Ports  Message
data-integrator/0*  active    idle   10.1.228.220         
mongodb-k8s/0*      active    idle   10.1.228.252         
mongodb-k8s/1       active    idle   10.1.228.248         
mongodb-k8s/2       unknown   lost   10.1.228.217         agent lost, see 'juju show-status-log mongodb-k8s/2'

Check uris and endpoints:

juju run-action data-integrator/0 get-credentials --wait
unit-data-integrator-0:
  UnitId: data-integrator/0
  id: "8"
  results:
    mongodb:
      database: pippo
      endpoints: mongodb-k8s-2.mongodb-k8s-endpoints,mongodb-k8s-1.mongodb-k8s-endpoints,mongodb-k8s-0.mongodb-k8s-endpoints
      password: CFrabjGfvU0QNdzgUXzWLPZaVewZCwpx
      replset: mongodb-k8s
      uris: mongodb://relation-2:CFrabjGfvU0QNdzgUXzWLPZaVewZCwpx@mongodb-k8s-2.mongodb-k8s-endpoints,mongodb-k8s-1.mongodb-k8s-endpoints,mongodb-k8s-0.mongodb-k8s-endpoints/pippo?replicaSet=mongodb-k8s&authSource=admin
      username: relation-2
    ok: "True"
  status: completed
  timing:
    completed: 2023-02-22 12:01:34 +0000 UTC
    enqueued: 2023-02-22 12:01:31 +0000 UTC
    started: 2023-02-22 12:01:33 +0000 UTC

The unit mongodb-k8s-2 is still present in the endpoints and URIs.

Environment

Juju 2.9.38
microk8s 1.26.1

Relevant log output

Please see reproduction steps.

Additional context

No response

Rename the `mongo` interface to be more specific

See, e.g., mongodb-datasource or something that expresses "I want to QUERY mongo" while leaving naming space for other relations powered by mongo, like a hypothetical "mongo-administration" if someone goes and charms Mongo OpsManager.

MongoDB charm fails to reconfigure replica set when leader unit departs

Issue

When the leader unit in a replica set deployed by the MongoDB charm departs, the replica set fails to reconfigure itself.

Steps to reproduce

  1. Deploy 7 or more units of the MongoDB charm from the edge channel (juju deploy mongodb-k8s --channel=edge -n 7) and wait for the replica set to be configured
  2. Check that none of the first three units (0-2) are leader units. If they are, remove the MongoDB application and retry step 1.
  3. Scale the MongoDB application to 3 units (juju scale-application mongodb-k8s 3)

Expected outcome

The MongoDB replica set is reconfigured as a 3 unit replica set

Actual outcome

The 3 units of the MongoDB replica set fail to reconfigure because the leader unit's knowledge of the state of the cluster is out of sync with its actual state.
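
For reference, a reconfiguration of this kind amounts to fetching the current replica set config, pruning the departed hosts, bumping the version and submitting it back; a minimal PyMongo sketch (illustrative only, not the charm's implementation):

def remove_departed_members(client, departed_hosts):
    """Drop departed hosts from the replica set config (must run against the primary)."""
    config = client.admin.command("replSetGetConfig")["config"]
    config["members"] = [m for m in config["members"] if m["host"] not in departed_hosts]
    config["version"] += 1  # replSetReconfig requires a strictly higher version
    client.admin.command("replSetReconfig", config)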

Failed to get pbm status errors

Steps to reproduce

  1. juju deploy mongodb-k8s
  2. juju deploy sdcore-nrf
  3. juju integrate mongodb-k8s sdcore-nrf

Expected behavior

No errors

Actual behavior

Failed to get pbm status errors appear in the charm logs.

Versions

Operating system: Ubuntu 22.04
Juju CLI: 3.1.6
Juju agent: 3.1.6
Charm revision: 6/edge (Revision 37)
microk8s: v1.27.6

Log output

unit-mongodb-k8s-0: 07:37:32 ERROR unit.mongodb-k8s/0.juju-log Failed to get pbm status. non-zero exit code 1 executing ['/usr/bin/pbm', 'status', '-o', 'json'], stdout='{"Error":"get status of pitr: check for errors: get current epoch: get config: get: mongo: no documents in result"}\n', stderr=''

Please add documentation to relation library

It would be very useful to have some documentation on how to use https://github.com/canonical/mongodb-k8s-operator/blob/main/lib/charms/mongodb_k8s/v0/mongodb.py.

If you add a multiline docstring at the top of this file it will be rendered as documentation on charmhub, making it easier for charm authors to integrate with your charm. As an example see https://charmhub.io/nginx-ingress-integrator/libraries/ingress and the corresponding documentation in https://github.com/canonical/nginx-ingress-integrator-operator/blob/main/lib/charms/nginx_ingress_integrator/v0/ingress.py.
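
A minimal sketch of what such a module-level docstring could look like at the top of the library file; the class and API names in the example are illustrative and may not match the library's actual interface:

"""Library for connecting to MongoDB provided by the MongoDB K8s charm.

Illustrative example of how a requirer charm might use it:

    from charms.mongodb_k8s.v0.mongodb import MongoDBConnection

    with MongoDBConnection(uri) as mongo:
        if mongo.is_ready:
            ...

Charmhub renders a docstring like this as the library's documentation page,
which makes it much easier for charm authors to integrate with the charm.
"""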

Mongo not able to start

Steps to reproduce

  1. sudo snap install microk8s --classic --channel=1.26
  2. sudo snap install juju --classic --channel=2.9/stable
  3. microk8s.enable hostpath-storage
  4. juju bootstrap microk8s
  5. juju deploy ch:mongodb-k8s -m test --channel 5/edge --storage mongodb=50M

Expected behavior

Mongo is usable after deployment.

Actual behavior

Charm shows mongo as usable, but the following errors are seen in the log, and the pod does not respond to port 27017 externally.

root@mongodb-k8s-0:/# nc -zvw1 mongodb-k8s 27017
nc: connect to mongodb-k8s (10.152.183.20) port 27017 (tcp) timed out: Operation now in progress
root@mongodb-k8s-0:/# nc -zvw1 127.0.0.1 27017
Connection to 127.0.0.1 27017 port [tcp/*] succeeded!

Versions

Operating system: Ubuntu 22.04
Juju CLI: 2.9
Juju agent: 2.9.45
Charm revision: mongodb-k8s 5/edge 36
microk8s: v1.26.10 revision 6102

Log output

2023-11-23T14:52:50.903577443Z 2023-11-23T14:52:50.903Z [pbm-agent] 2023-11-23T14:52:50.000+0000 W [agentCheckup] get current storage status: query mongo: mongo: no documents in result
2023-11-23T14:52:50.905493118Z 2023-11-23T14:52:50.905Z [pbm-agent] 2023-11-23T14:52:50.000+0000 E [agentCheckup] check storage connection: unable to get storage: get config: get: mongo: no documents in result
2023-11-23T14:52:55.905212611Z 2023-11-23T14:52:55.905Z [pbm-agent] 2023-11-23T14:52:55.000+0000 E [agentCheckup] check storage connection: unable to get storage: get config: get: mongo: no documents in result

Additional context

This charm has been used in OSM for a long time, but unfortunately it recently stopped working in all releases (v10.0, v12.0, v14.0). The community is going to replace the MongoDB charm with a Helm chart, as they no longer trust charms.

Charm CI/CD publishing is failing due to focal base and `--destructive-mode` in charming-actions/upload-charm

Bug Description

Hi,

Please check https://github.com/canonical/mongodb-k8s-operator/actions/runs/3995207354/jobs/6854407149

/usr/bin/sudo snap install charmcraft --classic --channel latest/stable
charmcraft 2.2.0 from Canonical** installed

/usr/bin/sudo charmcraft pack --destructive-mode --quiet
No suitable 'build-on' environment found in any 'bases' configuration.
Full execution log: '/root/.local/state/charmcraft/log/charmcraft-20230124-110008.895535.log'
Error: The process '/usr/bin/sudo' failed with exit code 1
Error: Error: The process '/usr/bin/sudo' failed with exit code 1

Options:

  1. pin runner to ubuntu-20.04 (current ubuntu-latest is 22.04)
  2. update charm to be based on jammy
  3. wait for charming-actions/upload-charm to be improved.

@delgod what is your choice?

To Reproduce

Merge something in branch main and check CI/CD: https://github.com/canonical/mongodb-k8s-operator/actions?query=branch%3Amain

It should pass, actual result: it is failing :-D

Environment

The branch main for this repo CI/CD.

Relevant log output

Posted above

Additional context

No response

AttributeError: module 'lib' has no attribute 'OpenSSL_add_all_algorithms'

Bug Description

The operator fails to deploy due to an incompatibility with OpenSSL, as discussed in this bug

To Reproduce

tox -e integration

Environment

integration installed: anyio==3.6.2,asttokens==2.2.1,attrs==22.2.0,backcall==0.2.0,bcrypt==4.0.1,black==22.12.0,CacheControl==0.12.11,cachetools==5.3.0,certifi==2022.12.7,cffi==1.15.1,charset-normalizer==3.0.1,cleo==2.0.1,click==8.1.3,codespell==2.2.2,coverage==7.1.0,crashtest==0.4.1,cryptography==39.0.0,decorator==5.1.1,distlib==0.3.6,dnspython==2.3.0,dulwich==0.20.50,exceptiongroup==1.1.0,executing==1.2.0,filelock==3.9.0,flake8==5.0.4,flake8-builtins==2.1.0,flake8-copyright==0.2.3,flake8-docstrings==1.7.0,google-auth==2.16.0,h11==0.14.0,html5lib==1.1,httpcore==0.16.3,httpx==0.23.3,idna==3.4,importlib-metadata==6.0.0,iniconfig==2.0.0,ipdb==0.13.11,ipython==8.8.0,isort==5.11.4,jaraco.classes==3.2.3,jedi==0.18.2,jeepney==0.8.0,Jinja2==3.1.2,jsonschema==4.17.3,juju==2.9.38.1,jujubundlelib==0.5.7,keyring==23.13.1,kubernetes==25.3.0,lightkube==0.12.0,lightkube-models==1.26.0.4,lockfile==0.12.2,macaroonbakery==1.3.1,MarkupSafe==2.1.2,matplotlib-inline==0.1.6,mccabe==0.7.0,more-itertools==9.0.0,msgpack==1.0.4,mypy-extensions==0.4.3,oauthlib==3.2.2,ops==2.0.0,packaging==23.0,paramiko==2.12.0,parso==0.8.3,pathspec==0.11.0,pep8-naming==0.13.3,pexpect==4.8.0,pickleshare==0.7.5,pkginfo==1.9.6,platformdirs==2.6.2,pluggy==1.0.0,poetry==1.3.2,poetry-core==1.4.0,poetry-plugin-export==1.3.0,prompt-toolkit==3.0.36,protobuf==3.20.3,ptyprocess==0.7.0,pure-eval==0.2.2,pyasn1==0.4.8,pyasn1-modules==0.2.8,pycodestyle==2.9.1,pycparser==2.21,pydocstyle==6.3.0,pyflakes==2.5.0,Pygments==2.14.0,pymacaroons==0.13.0,pymongo==4.3.3,PyNaCl==1.5.0,pyproject-flake8==5.0.4.post1,pyRFC3339==1.1,pyrsistent==0.19.3,pytest==7.2.1,pytest-asyncio==0.20.3,pytest-operator==0.23.0,python-dateutil==2.8.2,pytz==2022.7.1,PyYAML==6.0,rapidfuzz==2.13.7,requests==2.28.2,requests-oauthlib==1.3.1,requests-toolbelt==0.10.1,rfc3986==1.5.0,rsa==4.9,SecretStorage==3.3.3,shellingham==1.5.0.post1,six==1.16.0,sniffio==1.3.0,snowballstemmer==2.2.0,stack-data==0.6.2,tenacity==8.1.0,theblues==0.5.2,tomli==2.0.1,tomlkit==0.11.6,toposort==1.9,traitlets==5.8.1,trove-classifiers==2023.2.8,typing-inspect==0.8.0,typing_extensions==4.4.0,urllib3==1.26.14,virtualenv==20.19.0,wcwidth==0.2.6,webencodings==0.5.1,websocket-client==1.4.2,websockets==10.4,zipp==3.13.0
integration run-test-pre: PYTHONHASHSEED='54335413'

Relevant log output

Traceback (most recent call last):
  File "/var/lib/juju/agents/unit-mongodb-k8s-2/charm/./src/charm.py", line 16, in <module>
    from charms.mongodb.v0.helpers import (
  File "/var/lib/juju/agents/unit-mongodb-k8s-2/charm/lib/charms/mongodb/v0/helpers.py", line 10, in <module>
    from charms.mongodb.v0.mongodb import MongoDBConfiguration, MongoDBConnection
  File "/var/lib/juju/agents/unit-mongodb-k8s-2/charm/lib/charms/mongodb/v0/mongodb.py", line 12, in <module>
    from pymongo import MongoClient
  File "/var/lib/juju/agents/unit-mongodb-k8s-2/charm/venv/pymongo/__init__.py", line 92, in <module>
    from pymongo.mongo_client import MongoClient
  File "/var/lib/juju/agents/unit-mongodb-k8s-2/charm/venv/pymongo/mongo_client.py", line 59, in <module>
    from pymongo import (
  File "/var/lib/juju/agents/unit-mongodb-k8s-2/charm/venv/pymongo/uri_parser.py", line 23, in <module>
    from pymongo.client_options import _parse_ssl_options
  File "/var/lib/juju/agents/unit-mongodb-k8s-2/charm/venv/pymongo/client_options.py", line 26, in <module>
    from pymongo.pool import PoolOptions
  File "/var/lib/juju/agents/unit-mongodb-k8s-2/charm/venv/pymongo/pool.py", line 61, in <module>
    from pymongo.network import command, receive_message
  File "/var/lib/juju/agents/unit-mongodb-k8s-2/charm/venv/pymongo/network.py", line 24, in <module>
    from pymongo import _csot, helpers, message, ssl_support
  File "/var/lib/juju/agents/unit-mongodb-k8s-2/charm/venv/pymongo/ssl_support.py", line 22, in <module>
    import pymongo.pyopenssl_context as _ssl
  File "/var/lib/juju/agents/unit-mongodb-k8s-2/charm/venv/pymongo/pyopenssl_context.py", line 27, in <module>
    from OpenSSL import SSL as _SSL
  File "/usr/lib/python3/dist-packages/OpenSSL/__init__.py", line 8, in <module>
    from OpenSSL import crypto, SSL
  File "/usr/lib/python3/dist-packages/OpenSSL/crypto.py", line 3279, in <module>
    _lib.OpenSSL_add_all_algorithms()
AttributeError: module 'lib' has no attribute 'OpenSSL_add_all_algorithms'
ERROR juju.worker.uniter.operation hook "db-storage-attached" (via hook dispatching script: dispatch) failed: exit status 1

Additional context

No response

Charm doesn't deploy on GKE

Bug Description

Deploying the charm in Google Cloud's Kubernetes (GKE) doesn't work. The charm goes to an unknown/idle status.

To Reproduce

  1. juju deploy mongodb-k8s --trust --channel=edge

Environment

Juju: 2.9.38
GKE version: 1.24.9-gke.2000

Relevant log output

guillaume@thinkpad:~/PycharmProjects$ juju status
Model  Controller              Cloud/Region            Version  SLA          Timestamp
cn     gke-feb-15-us-central1  gke-feb-15/us-central1  2.9.38   unsupported  16:47:38-05:00

App      Version  Status   Scale  Charm        Channel  Rev  Address     Exposed  Message
mongodb           waiting      1  mongodb-k8s  edge      18  10.48.6.53  no       installing agent

Unit        Workload  Agent  Address     Ports  Message
mongodb/0*  unknown   idle   10.44.1.26         
unit-mongodb-0: 16:45:18 INFO juju.worker.uniter.operation ran "leader-elected" hook (via hook dispatching script: dispatch)
unit-mongodb-0: 16:45:18 ERROR unit.mongodb/0.juju-log Uncaught exception while in charm code:
Traceback (most recent call last):
  File "/var/lib/juju/agents/unit-mongodb-0/charm/./src/charm.py", line 467, in <module>
    main(MongoDBCharm)
  File "/var/lib/juju/agents/unit-mongodb-0/charm/venv/ops/main.py", line 435, in main
    _emit_charm_event(charm, dispatcher.event_name)
  File "/var/lib/juju/agents/unit-mongodb-0/charm/venv/ops/main.py", line 144, in _emit_charm_event
    event_to_emit.emit(*args, **kwargs)
  File "/var/lib/juju/agents/unit-mongodb-0/charm/venv/ops/framework.py", line 355, in emit
    framework._emit(event)  # noqa
  File "/var/lib/juju/agents/unit-mongodb-0/charm/venv/ops/framework.py", line 824, in _emit
    self._reemit(event_path)
  File "/var/lib/juju/agents/unit-mongodb-0/charm/venv/ops/framework.py", line 899, in _reemit
    custom_handler(event)
  File "/var/lib/juju/agents/unit-mongodb-0/charm/./src/charm.py", line 110, in on_mongod_pebble_ready
    container.replan()
  File "/var/lib/juju/agents/unit-mongodb-0/charm/venv/ops/model.py", line 1828, in replan
    self._pebble.replan_services()
  File "/var/lib/juju/agents/unit-mongodb-0/charm/venv/ops/pebble.py", line 1564, in replan_services
    return self._services_action('replan', [], timeout, delay)
  File "/var/lib/juju/agents/unit-mongodb-0/charm/venv/ops/pebble.py", line 1646, in _services_action
    raise ChangeError(change.err, change)
ops.pebble.ChangeError: cannot perform the following tasks:
- Start service "mongod" (cannot start service: exited quickly with code 100)
----- Logs from task 0 -----
2023-02-15T21:45:18Z INFO Most recent service output:
    (...)
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"REPL",     "id":4784900, "ctx":"initandlisten","msg":"Stepping down the ReplicationCoordinator for shutdown","attr":{"waitTimeMillis":15000}}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"COMMAND",  "id":4784901, "ctx":"initandlisten","msg":"Shutting down the MirrorMaestro"}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"SHARDING", "id":4784902, "ctx":"initandlisten","msg":"Shutting down the WaitForMajorityService"}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"NETWORK",  "id":20562,   "ctx":"initandlisten","msg":"Shutdown: going to close listening sockets"}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"NETWORK",  "id":4784905, "ctx":"initandlisten","msg":"Shutting down the global connection pool"}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"CONTROL",  "id":4784906, "ctx":"initandlisten","msg":"Shutting down the FlowControlTicketholder"}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"-",        "id":20520,   "ctx":"initandlisten","msg":"Stopping further Flow Control ticket acquisitions."}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"REPL",     "id":4784907, "ctx":"initandlisten","msg":"Shutting down the replica set node executor"}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"NETWORK",  "id":4784918, "ctx":"initandlisten","msg":"Shutting down the ReplicaSetMonitor"}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"SHARDING", "id":4784921, "ctx":"initandlisten","msg":"Shutting down the MigrationUtilExecutor"}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"ASIO",     "id":22582,   "ctx":"MigrationUtil-TaskExecutor","msg":"Killing all outstanding egress activity."}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"COMMAND",  "id":4784923, "ctx":"initandlisten","msg":"Shutting down the ServiceEntryPoint"}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"CONTROL",  "id":4784925, "ctx":"initandlisten","msg":"Shutting down free monitoring"}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"CONTROL",  "id":4784927, "ctx":"initandlisten","msg":"Shutting down the HealthLog"}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"CONTROL",  "id":4784928, "ctx":"initandlisten","msg":"Shutting down the TTL monitor"}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"CONTROL",  "id":4784929, "ctx":"initandlisten","msg":"Acquiring the global lock for shutdown"}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"-",        "id":4784931, "ctx":"initandlisten","msg":"Dropping the scope cache for shutdown"}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"FTDC",     "id":4784926, "ctx":"initandlisten","msg":"Shutting down full-time data capture"}
    {"t":{"$date":"2023-02-15T21:45:18.813+00:00"},"s":"I",  "c":"CONTROL",  "id":20565,   "ctx":"initandlisten","msg":"Now exiting"}
    {"t":{"$date":"2023-02-15T21:45:18.814+00:00"},"s":"I",  "c":"CONTROL",  "id":23138,   "ctx":"initandlisten","msg":"Shutting down","attr":{"exitCode":100}}
2023-02-15T21:45:18Z ERROR cannot start service: exited quickly with code 100
-----
unit-mongodb-0: 16:45:19 ERROR juju.worker.uniter.operation hook "mongod-pebble-ready" (via hook dispatching script: dispatch) failed: exit status 1
unit-mongodb-0: 16:45:19 ERROR juju.worker.uniter pebble poll failed for container "mongod": failed to send pebble-ready event: hook failed
unit-mongodb-0: 16:45:19 ERROR unit.mongodb/0.juju-log Uncaught exception while in charm code:
Traceback (most recent call last):
  File "/var/lib/juju/agents/unit-mongodb-0/charm/./src/charm.py", line 467, in <module>
    main(MongoDBCharm)
  File "/var/lib/juju/agents/unit-mongodb-0/charm/venv/ops/main.py", line 435, in main
    _emit_charm_event(charm, dispatcher.event_name)
  File "/var/lib/juju/agents/unit-mongodb-0/charm/venv/ops/main.py", line 144, in _emit_charm_event
    event_to_emit.emit(*args, **kwargs)
  File "/var/lib/juju/agents/unit-mongodb-0/charm/venv/ops/framework.py", line 355, in emit
    framework._emit(event)  # noqa
  File "/var/lib/juju/agents/unit-mongodb-0/charm/venv/ops/framework.py", line 824, in _emit
    self._reemit(event_path)
  File "/var/lib/juju/agents/unit-mongodb-0/charm/venv/ops/framework.py", line 899, in _reemit
    custom_handler(event)
  File "/var/lib/juju/agents/unit-mongodb-0/charm/./src/charm.py", line 110, in on_mongod_pebble_ready
    container.replan()
  File "/var/lib/juju/agents/unit-mongodb-0/charm/venv/ops/model.py", line 1828, in replan
    self._pebble.replan_services()
  File "/var/lib/juju/agents/unit-mongodb-0/charm/venv/ops/pebble.py", line 1564, in replan_services
    return self._services_action('replan', [], timeout, delay)
  File "/var/lib/juju/agents/unit-mongodb-0/charm/venv/ops/pebble.py", line 1646, in _services_action
    raise ChangeError(change.err, change)
ops.pebble.ChangeError: cannot perform the following tasks:
- Start service "mongod" (cannot start service: exited quickly with code 100)
----- Logs from task 0 -----
2023-02-15T21:45:19Z INFO Most recent service output:
    (...)
    {"t":{"$date":"2023-02-15T21:45:19.882+00:00"},"s":"I",  "c":"REPL",     "id":4784900, "ctx":"initandlisten","msg":"Stepping down the ReplicationCoordinator for shutdown","attr":{"waitTimeMillis":15000}}
    {"t":{"$date":"2023-02-15T21:45:19.882+00:00"},"s":"I",  "c":"COMMAND",  "id":4784901, "ctx":"initandlisten","msg":"Shutting down the MirrorMaestro"}
    {"t":{"$date":"2023-02-15T21:45:19.882+00:00"},"s":"I",  "c":"SHARDING", "id":4784902, "ctx":"initandlisten","msg":"Shutting down the WaitForMajorityService"}
    {"t":{"$date":"2023-02-15T21:45:19.882+00:00"},"s":"I",  "c":"NETWORK",  "id":20562,   "ctx":"initandlisten","msg":"Shutdown: going to close listening sockets"}
    {"t":{"$date":"2023-02-15T21:45:19.882+00:00"},"s":"I",  "c":"NETWORK",  "id":4784905, "ctx":"initandlisten","msg":"Shutting down the global connection pool"}
    {"t":{"$date":"2023-02-15T21:45:19.882+00:00"},"s":"I",  "c":"CONTROL",  "id":4784906, "ctx":"initandlisten","msg":"Shutting down the FlowControlTicketholder"}
    {"t":{"$date":"2023-02-15T21:45:19.882+00:00"},"s":"I",  "c":"-",        "id":20520,   "ctx":"initandlisten","msg":"Stopping further Flow Control ticket acquisitions."}
    {"t":{"$date":"2023-02-15T21:45:19.882+00:00"},"s":"I",  "c":"REPL",     "id":4784907, "ctx":"initandlisten","msg":"Shutting down the replica set node executor"}
    {"t":{"$date":"2023-02-15T21:45:19.882+00:00"},"s":"I",  "c":"NETWORK",  "id":4784918, "ctx":"initandlisten","msg":"Shutting down the ReplicaSetMonitor"}
    {"t":{"$date":"2023-02-15T21:45:19.882+00:00"},"s":"I",  "c":"SHARDING", "id":4784921, "ctx":"initandlisten","msg":"Shutting down the MigrationUtilExecutor"}
    {"t":{"$date":"2023-02-15T21:45:19.882+00:00"},"s":"I",  "c":"ASIO",     "id":22582,   "ctx":"MigrationUtil-TaskExecutor","msg":"Killing all outstanding egress activity."}
    {"t":{"$date":"2023-02-15T21:45:19.882+00:00"},"s":"I",  "c":"COMMAND",  "id":4784923, "ctx":"initandlisten","msg":"Shutting down the ServiceEntryPoint"}
    {"t":{"$date":"2023-02-15T21:45:19.883+00:00"},"s":"I",  "c":"CONTROL",  "id":4784925, "ctx":"initandlisten","msg":"Shutting down free monitoring"}
    {"t":{"$date":"2023-02-15T21:45:19.883+00:00"},"s":"I",  "c":"CONTROL",  "id":4784927, "ctx":"initandlisten","msg":"Shutting down the HealthLog"}
    {"t":{"$date":"2023-02-15T21:45:19.883+00:00"},"s":"I",  "c":"CONTROL",  "id":4784928, "ctx":"initandlisten","msg":"Shutting down the TTL monitor"}
    {"t":{"$date":"2023-02-15T21:45:19.883+00:00"},"s":"I",  "c":"CONTROL",  "id":4784929, "ctx":"initandlisten","msg":"Acquiring the global lock for shutdown"}
    {"t":{"$date":"2023-02-15T21:45:19.883+00:00"},"s":"I",  "c":"-",        "id":4784931, "ctx":"initandlisten","msg":"Dropping the scope cache for shutdown"}
    {"t":{"$date":"2023-02-15T21:45:19.883+00:00"},"s":"I",  "c":"FTDC",     "id":4784926, "ctx":"initandlisten","msg":"Shutting down full-time data capture"}
    {"t":{"$date":"2023-02-15T21:45:19.883+00:00"},"s":"I",  "c":"CONTROL",  "id":20565,   "ctx":"initandlisten","msg":"Now exiting"}
    {"t":{"$date":"2023-02-15T21:45:19.883+00:00"},"s":"I",  "c":"CONTROL",  "id":23138,   "ctx":"initandlisten","msg":"Shutting down","attr":{"exitCode":100}}
2023-02-15T21:45:19Z ERROR cannot start service: exited quickly with code 100
-----
unit-mongodb-0: 16:45:20 ERROR juju.worker.uniter.operation hook "mongod-pebble-ready" (via hook dispatching script: dispatch) failed: exit status 1
unit-mongodb-0: 16:45:20 ERROR juju.worker.uniter pebble poll failed for container "mongod": failed to send pebble-ready event: hook failed
unit-mongodb-0: 16:45:21 INFO juju.worker.uniter.operation ran "db-storage-attached" hook (via hook dispatching script: dispatch)
unit-mongodb-0: 16:45:21 INFO juju.worker.uniter.operation ran "config-changed" hook (via hook dispatching script: dispatch)
unit-mongodb-0: 16:45:21 INFO juju.worker.uniter found queued "start" hook
unit-mongodb-0: 16:45:22 INFO unit.mongodb/0.juju-log Running legacy hooks/start.

Additional context

No response

MongoServerError: not primary when executing db.createUser

Pebble error when trying to create a user. This error occurred after integrating the mongodb-k8s charm with a database requirer through the database relation.

Steps to reproduce (taken from the Charmed 5G Getting Started Guide)

Add the community repository MicroK8s addon:

sudo microk8s addons repo add community https://github.com/canonical/microk8s-community-addons --reference feat/strict-fix-multus

Enable the following MicroK8s addons. We must give MetalLB an address range that has at least 3 IP addresses for Charmed 5G.

sudo microk8s enable hostpath-storage
sudo microk8s enable multus
sudo microk8s enable metallb:10.0.0.2-10.0.0.4

Bootstrap a Juju controller

juju bootstrap microk8s

Create a Juju model:

juju add-model core

Deploy the sdcore-router-k8s operator:

juju deploy sdcore-router-k8s router --trust --channel=beta

Deploy the sdcore-k8s charm bundle:

juju deploy sdcore-k8s --trust --channel=beta

Expected behavior

DB users get created without error

Actual behavior

DB user does not get created and requirer charm stays in waiting state.

Versions

Operating system: Ubuntu 23.10
Juju CLI: 3.1.6
Juju agent: 3.1.6
Charm revision: 6/edge (rev. 38)
microk8s: v1.27.7

Log output

Partial output of juju debug-log

unit-mongodb-k8s-0: 14:31:17 ERROR unit.mongodb-k8s/0.juju-log database:7: Failed to create the operator user: non-zero exit code 1 executing ['/usr/bin/mongosh', 'mongodb://localhost/admin', '--quiet', '--eval', "db.createUser({  user: 'operator',  pwd: passwordPrompt(),  roles:[    {'role': 'userAdminAnyDatabase', 'db': 'admin'},     {'role': 'readWriteAnyDatabase', 'db': 'admin'},     {'role': 'clusterAdmin', 'db': 'admin'},   ],  mechanisms: ['SCRAM-SHA-256'],  passwordDigestor: 'server',})"], stdout='Enter password\n********************************', stderr='MongoServerError: not primary\n'
Traceback (most recent call last):
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/./src/charm.py", line 625, in _init_operator_user
    process.wait_output()
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/pebble.py", line 1374, in wait_output
    raise ExecError[AnyStr](self._command, exit_code, out_value, err_value)
ops.pebble.ExecError: non-zero exit code 1 executing ['/usr/bin/mongosh', 'mongodb://localhost/admin', '--quiet', '--eval', "db.createUser({  user: 'operator',  pwd: passwordPrompt(),  roles:[    {'role': 'userAdminAnyDatabase', 'db': 'admin'},     {'role': 'readWriteAnyDatabase', 'db': 'admin'},     {'role': 'clusterAdmin', 'db': 'admin'},   ],  mechanisms: ['SCRAM-SHA-256'],  passwordDigestor: 'server',})"], stdout='Enter password\n********************************', stderr='MongoServerError: not primary\n'
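
For context on the error above, a minimal PyMongo sketch (illustrative only, not the charm's code) of checking that the connected member is the primary before attempting to create a user, since user management commands only succeed on the primary:

from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017", directConnection=True)
if client.is_primary:  # backed by the server's hello/isMaster response
    client.admin.command(
        "createUser", "operator",
        pwd="<password>",  # placeholder
        roles=[{"role": "userAdminAnyDatabase", "db": "admin"}],
    )
else:
    pass  # not primary yet: defer and retry on a later event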

Juju status:

guillaume@potiron:~$ juju status
Model  Controller      Cloud/Region        Version  SLA          Timestamp
core   microk8s-local  microk8s/localhost  3.1.6    unsupported  14:41:37-05:00

App                       Version  Status   Scale  Charm                     Channel        Rev  Address         Exposed  Message
amf                                active       1  sdcore-amf                edge            48  10.152.183.194  no       
ausf                               active       1  sdcore-ausf               edge            36  10.152.183.28   no       
grafana-agent-k8s         0.32.1   waiting      1  grafana-agent-k8s         latest/stable   42  10.152.183.51   no       installing agent
mongodb-k8s                        active       1  mongodb-k8s               6/edge          38  10.152.183.200  no       Primary
nrf                                active       1  sdcore-nrf                edge            54  10.152.183.85   no       
nssf                               active       1  sdcore-nssf               edge            33  10.152.183.79   no       
pcf                                waiting      1  sdcore-pcf                edge            27  10.152.183.98   no       installing agent
router                             active       1  sdcore-router             edge            28  10.152.183.181  no       
self-signed-certificates           active       1  self-signed-certificates  beta            33  10.152.183.87   no       
smf                                active       1  sdcore-smf                edge            31  10.152.183.55   no       
udm                                active       1  sdcore-udm                edge            31  10.152.183.94   no       
udr                                active       1  sdcore-udr                edge            26  10.152.183.40   no       
webui                              active       1  sdcore-webui              edge            18  10.152.183.196  no       

Unit                         Workload  Agent  Address      Ports  Message
amf/0*                       active    idle   10.1.182.25         
ausf/0*                      active    idle   10.1.182.1          
grafana-agent-k8s/0*         blocked   idle   10.1.182.6          grafana-cloud-config: off, logging-consumer: off
mongodb-k8s/0*               active    idle   10.1.182.7          Primary
nrf/0*                       active    idle   10.1.182.35         
nssf/0*                      active    idle   10.1.182.32         
pcf/0*                       waiting   idle   10.1.182.43         Waiting for `database` relation to be available
router/0*                    active    idle   10.1.182.34         
self-signed-certificates/0*  active    idle   10.1.182.42         
smf/0*                       active    idle   10.1.182.45         
udm/0*                       active    idle   10.1.182.38         
udr/0*                       active    idle   10.1.182.48         
webui/0*                     active    idle   10.1.182.46

Integration provider                   Requirer                            Interface          Type     Message
amf:metrics-endpoint                   grafana-agent-k8s:metrics-endpoint  prometheus_scrape  regular  
grafana-agent-k8s:logging-provider     mongodb-k8s:logging                 loki_push_api      regular  
mongodb-k8s:database                   amf:database                        mongodb_client     regular  
mongodb-k8s:database                   nrf:database                        mongodb_client     regular  
mongodb-k8s:database                   pcf:database                        mongodb_client     regular  
mongodb-k8s:database                   smf:database                        mongodb_client     regular  
mongodb-k8s:database                   udr:database                        mongodb_client     regular  
mongodb-k8s:database                   webui:database                      mongodb_client     regular  
mongodb-k8s:database-peers             mongodb-k8s:database-peers          mongodb-peers      peer     
mongodb-k8s:metrics-endpoint           grafana-agent-k8s:metrics-endpoint  prometheus_scrape  regular  
nrf:fiveg-nrf                          amf:fiveg-nrf                       fiveg_nrf          regular  
nrf:fiveg-nrf                          ausf:fiveg_nrf                      fiveg_nrf          regular  
nrf:fiveg-nrf                          nssf:fiveg_nrf                      fiveg_nrf          regular  
nrf:fiveg-nrf                          pcf:fiveg_nrf                       fiveg_nrf          regular  
nrf:fiveg-nrf                          smf:fiveg_nrf                       fiveg_nrf          regular  
nrf:fiveg-nrf                          udm:fiveg_nrf                       fiveg_nrf          regular  
nrf:fiveg-nrf                          udr:fiveg_nrf                       fiveg_nrf          regular  
self-signed-certificates:certificates  amf:certificates                    tls-certificates   regular  
self-signed-certificates:certificates  ausf:certificates                   tls-certificates   regular  
self-signed-certificates:certificates  nrf:certificates                    tls-certificates   regular  
self-signed-certificates:certificates  nssf:certificates                   tls-certificates   regular  
self-signed-certificates:certificates  pcf:certificates                    tls-certificates   regular  
self-signed-certificates:certificates  smf:certificates                    tls-certificates   regular  
self-signed-certificates:certificates  udm:certificates                    tls-certificates   regular  
self-signed-certificates:certificates  udr:certificates                    tls-certificates   regular  
smf:metrics-endpoint                   grafana-agent-k8s:metrics-endpoint  prometheus_scrape  regular

Here's the bundle that was used to generate this error:

bundle: kubernetes
name: sdcore-control-plane
description: The SD-Core Control Plane bundle contains the control plane part of the 5G core network.
applications:
  amf:
    charm: sdcore-amf
    channel: edge
    scale: 1
    trust: true
  ausf:
    charm: sdcore-ausf
    channel: edge
    scale: 1
    trust: true
  nrf:
    charm: sdcore-nrf
    channel: edge
    scale: 1
    trust: true
  nssf:
    charm: sdcore-nssf
    channel: edge
    scale: 1
    trust: true
  pcf:
    charm: sdcore-pcf
    channel: edge
    scale: 1
    trust: true
  smf:
    charm: sdcore-smf
    channel: edge
    scale: 1
    trust: true
  udm:
    charm: sdcore-udm
    channel: edge
    scale: 1
    trust: true
  udr:
    charm: sdcore-udr
    channel: edge
    scale: 1
    trust: true
  webui:
    charm: sdcore-webui
    scale: 1
    channel: edge
    trust: true
  mongodb-k8s:
    charm: mongodb-k8s
    channel: 6/edge
    scale: 1
    trust: true
  grafana-agent-k8s:
    charm: grafana-agent-k8s
    channel: latest/stable
    scale: 1
  self-signed-certificates:
    charm: self-signed-certificates
    channel: beta
    scale: 1
relations:
  - - amf:fiveg-nrf
    - nrf:fiveg-nrf
  - - amf:database
    - mongodb-k8s:database
  - - amf:metrics-endpoint
    - grafana-agent-k8s:metrics-endpoint
  - - amf:certificates
    - self-signed-certificates:certificates
  - - ausf:fiveg_nrf
    - nrf:fiveg-nrf
  - - ausf:certificates
    - self-signed-certificates:certificates
  - - nrf:database
    - mongodb-k8s:database
  - - nrf:certificates
    - self-signed-certificates:certificates
  - - nssf:fiveg_nrf
    - nrf:fiveg-nrf
  - - nssf:certificates
    - self-signed-certificates:certificates
  - - pcf:fiveg_nrf
    - nrf:fiveg-nrf
  - - pcf:database
    - mongodb-k8s:database
  - - pcf:certificates
    - self-signed-certificates:certificates
  - - smf:fiveg_nrf
    - nrf:fiveg-nrf
  - - smf:metrics-endpoint
    - grafana-agent-k8s:metrics-endpoint
  - - smf:database
    - mongodb-k8s:database
  - - smf:certificates
    - self-signed-certificates:certificates
  - - udm:fiveg_nrf
    - nrf:fiveg-nrf
  - - udm:certificates
    - self-signed-certificates:certificates
  - - udr:fiveg_nrf
    - nrf:fiveg-nrf
  - - udr:database
    - mongodb-k8s:database
  - - udr:certificates
    - self-signed-certificates:certificates
  - - webui:database
    - mongodb-k8s:database
  - - mongodb-k8s:metrics-endpoint
    - grafana-agent-k8s:metrics-endpoint
  - - mongodb-k8s:logging
    - grafana-agent-k8s:logging-provider
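
For reference, a bundle like this can be deployed from a local file. A minimal sketch, assuming the YAML above is saved as sdcore-control-plane.yaml (the filename is illustrative):

# deploy the local bundle; --trust grants the charms that declare
# `trust: true` the Kubernetes permissions they request
juju deploy ./sdcore-control-plane.yaml --trust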

Ambiguous description/username

On this page, the description is a little ambiguous: it says to use the admin user with "admin" as the username, but the statement below uses "operator". Consider clarifying that "operator" is the name of the admin user (if that is correct), or updating the statement above.

"""
Retrieving the username: In this case, we are using the admin user to connect to MongoDB. Use admin as the username:

export DB_USERNAME="operator"
"""

Redeploying mongodb in the same model leads to authentication failure.

Steps to reproduce

  1. Deploy mongodb charm
  2. Use juju remove-application to remove the deployed mongodb-charm
  3. Deploy mongodb charm again
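
As shell commands, the reproduction amounts to the following (a minimal sketch; any channel or config flags used in the original deployment would be repeated on the second deploy):

# 1. deploy the charm
juju deploy mongodb-k8s
# 2. remove the deployed application
juju remove-application mongodb-k8s
# 3. deploy the charm again into the same model
juju deploy mongodb-k8s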

This leads to an authentication failure error:

unit-mongodb-k8s-0: 17:30:08 ERROR unit.mongodb-k8s/0.juju-log Uncaught exception while in charm code:
Traceback (most recent call last):
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/pool.py", line 1394, in _get_socket
    sock_info = self.sockets.popleft()
IndexError: pop from an empty deque

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "./src/charm.py", line 347, in <module>
    main(MongoDBCharm, use_juju_for_storage=True)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/main.py", line 409, in main
    _emit_charm_event(charm, dispatcher.event_name)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/main.py", line 143, in _emit_charm_event
    event_to_emit.emit(*args, **kwargs)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 278, in emit
    framework._emit(event)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 722, in _emit
    self._reemit(event_path)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 767, in _reemit
    custom_handler(event)
  File "./src/charm.py", line 105, in _on_config_changed
    self._on_update_status(event)
  File "./src/charm.py", line 160, in _on_update_status
    if not self.mongo.is_ready():
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/src/mongoserver.py", line 54, in is_ready
    client.server_info()
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/mongo_client.py", line 1994, in server_info
    return self.admin.command("buildinfo",
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/database.py", line 751, in command
    with self.__client._socket_for_reads(
  File "/usr/lib/python3.8/contextlib.py", line 113, in __enter__
    return next(self.gen)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/mongo_client.py", line 1389, in _socket_for_reads
    with self._get_socket(server, session) as sock_info:
  File "/usr/lib/python3.8/contextlib.py", line 113, in __enter__
    return next(self.gen)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/mongo_client.py", line 1308, in _get_socket
    with server.get_socket(
  File "/usr/lib/python3.8/contextlib.py", line 113, in __enter__
    return next(self.gen)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/pool.py", line 1331, in get_socket
    sock_info = self._get_socket(all_credentials)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/pool.py", line 1397, in _get_socket
    sock_info = self.connect(all_credentials)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/pool.py", line 1297, in connect
    sock_info.check_auth(all_credentials)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/pool.py", line 820, in check_auth
    self.authenticate(credentials)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/pool.py", line 837, in authenticate
    auth.authenticate(credentials, self)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/auth.py", line 672, in authenticate
    auth_func(credentials, sock_info)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/auth.py", line 588, in _authenticate_default
    return _authenticate_scram(credentials, sock_info, 'SCRAM-SHA-256')
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/auth.py", line 333, in _authenticate_scram
    res = sock_info.command(source, cmd)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/pool.py", line 710, in command
    return command(self, dbname, spec, secondary_ok,
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/network.py", line 158, in command
    helpers._check_command_response(
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/pymongo/helpers.py", line 167, in _check_command_response
    raise OperationFailure(errmsg, code, response, max_wire_version)
pymongo.errors.OperationFailure: Authentication failed., full error: {'ok': 0.0, 'errmsg': 'Authentication failed.', 'code': 18, 'codeName': 'AuthenticationFailed', '$clusterTime': {'clusterTime': Timestamp(1631118608, 2), 'signature': {'hash': b'C?n\xd4\xa0\xe1\xc4\x14\r\x04\x98;\x08w^Y\x02\xec\xc2\xc3', 'keyId': 7005600716479791108}}, 'operationTime': Timestamp(1631118608, 2)}

Courtesy of Nashwan Azhari for reporting this.

Deploying off of `edge` leads to flaky results.

As of the time of this writing, deploying the edge release of the Operator on Charmhub (revision 6) leads to flaky deployments.

Even when deployed by itself on a fresh controller/model, the odds are about 50-50 that the charm actually becomes active.

The most common error encountered is:

ubuntu@nashu-vm:~$ juju debug-log --include mongodb-k8s/0 --replay | head -200
unit-mongodb-k8s-0: 16:02:07 WARNING juju.worker.proxyupdater unable to set snap core settings [proxy.http= proxy.https= proxy.store=]: exec: "snap": executable file not found in $PATH, output: ""
unit-mongodb-k8s-0: 16:02:07 INFO juju.worker.caasupgrader abort check blocked until version event received
unit-mongodb-k8s-0: 16:02:07 INFO juju.worker.caasupgrader unblocking abort check
unit-mongodb-k8s-0: 16:02:07 INFO juju.worker.leadership mongodb-k8s/0 promoted to leadership of mongodb-k8s
unit-mongodb-k8s-0: 16:02:07 INFO juju.agent.tools ensure jujuc symlinks in /var/lib/juju/tools/unit-mongodb-k8s-0
unit-mongodb-k8s-0: 16:02:07 INFO juju.worker.uniter unit "mongodb-k8s/0" started
unit-mongodb-k8s-0: 16:02:07 INFO juju.worker.uniter resuming charm install
unit-mongodb-k8s-0: 16:02:07 INFO juju.worker.uniter.charm downloading ch:amd64/focal/mongodb-k8s-6 from API server
unit-mongodb-k8s-0: 16:02:07 INFO juju.downloader downloading from ch:amd64/focal/mongodb-k8s-6
unit-mongodb-k8s-0: 16:02:07 INFO juju.downloader download complete ("ch:amd64/focal/mongodb-k8s-6")
unit-mongodb-k8s-0: 16:02:08 INFO juju.downloader download verified ("ch:amd64/focal/mongodb-k8s-6")
unit-mongodb-k8s-0: 16:02:12 INFO juju.worker.uniter hooks are retried true
unit-mongodb-k8s-0: 16:02:12 INFO juju.worker.uniter found queued "install" hook
unit-mongodb-k8s-0: 16:02:14 INFO unit.mongodb-k8s/0.juju-log Running legacy hooks/install.
unit-mongodb-k8s-0: 16:02:15 INFO juju.worker.uniter.operation ran "install" hook (via hook dispatching script: dispatch)
unit-mongodb-k8s-0: 16:02:19 INFO juju.worker.uniter.operation ran "mongodb-relation-created" hook (via hook dispatching script: dispatch)
unit-mongodb-k8s-0: 16:02:22 INFO juju.worker.uniter.operation ran "database-relation-created" hook (via hook dispatching script: dispatch)
unit-mongodb-k8s-0: 16:02:23 INFO juju.worker.uniter found queued "leader-elected" hook
unit-mongodb-k8s-0: 16:02:25 INFO juju.worker.uniter.operation ran "leader-elected" hook (via hook dispatching script: dispatch)
unit-mongodb-k8s-0: 16:02:28 INFO unit.mongodb-k8s/0.juju-log Restarted mongodb container
unit-mongodb-k8s-0: 16:02:39 ERROR unit.mongodb-k8s/0.juju-log Uncaught exception while in charm code:
Traceback (most recent call last):
  File "./src/charm.py", line 350, in <module>
    main(MongoDBCharm, use_juju_for_storage=True)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/main.py", line 408, in main
    _emit_charm_event(charm, dispatcher.event_name)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/main.py", line 142, in _emit_charm_event
    event_to_emit.emit(*args, **kwargs)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 275, in emit
    framework._emit(event)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 719, in _emit
    self._reemit(event_path)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 766, in _reemit
    custom_handler(event)
  File "./src/charm.py", line 101, in _on_config_changed
    self._on_update_status(event)
  File "./src/charm.py", line 162, in _on_update_status
    container.restart("mongodb")
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/model.py", line 1131, in restart
    self._pebble.restart_services(service_names)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/pebble.py", line 906, in restart_services
    return self._services_action('restart', services, timeout, delay)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/pebble.py", line 924, in _services_action
    raise ChangeError(change.err, change)
ops.pebble.ChangeError: cannot perform the following tasks:
- Stop service "mongodb" (process still runs after SIGTERM and SIGKILL)
unit-mongodb-k8s-0: 16:02:40 ERROR juju.worker.uniter.operation hook "mongodb-pebble-ready" (via hook dispatching script: dispatch) failed: exit status 1
unit-mongodb-k8s-0: 16:02:40 ERROR juju.worker.uniter pebble poll failed for container "mongodb": hook failed
unit-mongodb-k8s-0: 16:02:51 ERROR unit.mongodb-k8s/0.juju-log Uncaught exception while in charm code:
Traceback (most recent call last):
  File "./src/charm.py", line 350, in <module>
    main(MongoDBCharm, use_juju_for_storage=True)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/main.py", line 408, in main
    _emit_charm_event(charm, dispatcher.event_name)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/main.py", line 142, in _emit_charm_event
    event_to_emit.emit(*args, **kwargs)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 275, in emit
    framework._emit(event)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 719, in _emit
    self._reemit(event_path)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/framework.py", line 766, in _reemit
    custom_handler(event)
  File "./src/charm.py", line 96, in _on_config_changed
    container.restart("mongodb")
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/model.py", line 1131, in restart
    self._pebble.restart_services(service_names)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/pebble.py", line 906, in restart_services
    return self._services_action('restart', services, timeout, delay)
  File "/var/lib/juju/agents/unit-mongodb-k8s-0/charm/venv/ops/pebble.py", line 924, in _services_action
    raise ChangeError(change.err, change)
ops.pebble.ChangeError: cannot perform the following tasks:
- Stop service "mongodb" (process still runs after SIGTERM and SIGKILL)
unit-mongodb-k8s-0: 16:02:51 ERROR juju.worker.uniter.operation hook "mongodb-pebble-ready" (via hook dispatching script: dispatch) failed: exit status 1
unit-mongodb-k8s-0: 16:02:51 ERROR juju.worker.uniter pebble poll failed for container "mongodb": hook failed
unit-mongodb-k8s-0: 16:03:03 ERROR unit.mongodb-k8s/0.juju-log Uncaught exception while in charm code:
Traceback (most recent call last):
...
### Same error trace repeats indefinitely....

Environment:

  • an OpenStack VM running Ubuntu 20.04, deployed from the latest official Ubuntu Cloud Image
  • microk8s installed from snap: v1.22.2 (Rev 2551)
  • juju installed from snap: 2.9.17 (Rev 17553)

mongodb-k8s doesn't work in stable release 13

Bug Description

The latest stable charm release of mongodb-k8s, released on 2022/07/26, cannot be related to any other requirer.

To Reproduce

juju add-model tests
juju deploy mongodb-k8s
juju deploy finos-legend-db-k8s --channel=edge
juju relate mongodb-k8s finos-legend-db-k8s

Environment

Locally: Ubuntu 20.04.4 LTS, kernel 5.4.0-122-generic
juju version: 2.9.32-ubuntu-amd64

Relevant log output

paula@paula-server2:~$ juju add-model tests
Added 'tests' model on microk8s/localhost with credential 'microk8s' for user 'admin'

# deploying latest stable version of the charm
paula@paula-server2:~$ juju deploy mongodb-k8s
Located charm "mongodb-k8s" in charm-hub, revision 13
Deploying "mongodb-k8s" from charm-hub charm "mongodb-k8s", revision 13 in channel stable on focal

# deploying a charm that requires mongodb
paula@paula-server2:~$ juju deploy finos-legend-db-k8s --channel=edge
Located charm "finos-legend-db-k8s" in charm-hub, revision 19
Deploying "finos-legend-db-k8s" from charm-hub charm "finos-legend-db-k8s", revision 19 in channel edge on focal

# trying to add a relation between the two charms
paula@paula-server2:~$ juju relate mongodb-k8s finos-legend-db-k8s
ERROR no relations found

# deploying the latest edge version of mongodb
paula@paula-server2:~$ juju deploy mongodb-k8s --channel=edge mongodb-edge
Located charm "mongodb-k8s" in charm-hub, revision 12
Deploying "mongodb-edge" from charm-hub charm "mongodb-k8s", revision 12 in channel edge on focal

# successfully added a relation between the edge version of mongodb and the requirer
paula@paula-server2:~$ juju relate mongodb-edge finos-legend-db-k8s

paula@paula-server2:~$ juju status --relations
Model  Controller               Cloud/Region        Version  SLA          Timestamp
tests  finos-legend-controller  microk8s/localhost  2.9.32   unsupported  15:20:11Z

App                  Version  Status   Scale  Charm                Channel  Rev  Address         Exposed  Message
finos-legend-db-k8s           waiting      1  finos-legend-db-k8s  edge      19  10.152.183.192  no       installing agent
mongodb-edge                  waiting    0/1  mongodb-k8s          edge      12  10.152.183.33   no       installing agent
mongodb-k8s                   waiting      1  mongodb-k8s          stable    13                  no       service not ready yet

Unit                    Workload  Agent       Address      Ports  Message
finos-legend-db-k8s/0*  blocked   idle        10.1.134.26         requires relating to: mongodb-k8s
mongodb-edge/0*         waiting   allocating  10.1.134.29         agent initializing
mongodb-k8s/0*          waiting   idle                            service not ready yet

Relation provider      Requirer                Interface          Type     Message
mongodb-edge:database  finos-legend-db-k8s:db  mongodb_datastore  regular
mongodb-edge:mongodb   mongodb-edge:mongodb    mongodb            peer     joining
mongodb-k8s:mongodb    mongodb-k8s:mongodb     mongodb            peer
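
One way to confirm which relation endpoints each revision actually exposes is to query Charmhub from the Juju client and compare it with what is deployed in the model. A sketch, assuming a Juju client recent enough to provide `juju info`:

# show the charm's published metadata, including its provides/requires endpoints
juju info mongodb-k8s
# compare with the relations of the revisions deployed in the current model
juju status --relations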


Dependency Dashboard

This issue lists Renovate updates and detected dependencies. Read the Dependency Dashboard docs to learn more.


Detected dependencies

github-actions
.github/workflows/ci.yaml
  • actions/checkout v3
  • actions/checkout v3
  • canonical/data-platform-workflows v5
  • actions/checkout v3
  • actions/download-artifact v3
.github/workflows/release.yaml
  • actions/checkout v3
  • canonical/charming-actions 2.3.0
  • actions/checkout v3
  • canonical/charming-actions 2.3.0
.github/workflows/sync_issue_to_jira.yaml
  • canonical/data-platform-workflows v2
pip_requirements
requirements.txt
  • attrs ==23.1.0
  • cffi ==1.15.1
  • cryptography ==40.0.2
  • dnspython ==2.3.0
  • importlib-resources ==5.12.0
  • jsonschema ==4.17.3
  • ops ==2.4.1
  • pkgutil-resolve-name ==1.3.10
  • pycparser ==2.21
  • pymongo ==4.3.3
  • pyrsistent ==0.19.3
  • pyyaml ==6.0.1
  • tenacity ==8.2.2
  • websocket-client ==1.5.2
  • zipp ==3.15.0
tests/integration/ha_tests/application_charm/requirements.txt
  • ops >= 1.5.0
  • tenacity ==8.2.2
  • pymongo ==4.3.3
tests/integration/relation_tests/application-charm/requirements.txt
  • ops >= 1.4.0
poetry
pyproject.toml
  • python ^3.8.10
  • ops ^2.4.1
  • pymongo ^4.3.3
  • tenacity ^8.2.2
  • cryptography ^40.0.2
  • jsonschema ^4.17.3
  • pyyaml ^6.0.1
  • black ^23.3.0
  • isort ^5.12.0
  • flake8 ^6.0.0
  • flake8-docstrings ^1.7.0
  • flake8-copyright ^0.2.4
  • flake8-builtins ^2.1.0
  • pyproject-flake8 ^6.0.0-post1
  • pep8-naming ^0.13.3
  • codespell ^2.2.4
  • coverage ^7.2.7
  • pytest ^7.3.1
  • parameterized ^0.9.0
  • lightkube ^0.13.0
  • pytest ^7.3.1
  • pytest-mock ^3.11.1
  • pytest-operator ^0.27.0
  • juju 3.2.2
  • pytest-operator-cache v5

