didx-xyz / aries-cloudcontroller-python
A repository for leveraging Self-Sovereign Identity in applications
License: Apache License 2.0
The checks for accepted time values in aries_cloudcontroller/model/taa_acceptance.py
are wrong.
Currently:
@validator("time")
def time_max(cls, value):
    # Property is optional
    if value is None:
        return
    if value > -1:
        raise ValueError(f"time must be less than -1, currently {value}")
    return value

@validator("time")
def time_min(cls, value):
    # Property is optional
    if value is None:
        return
    if value < 0:
        raise ValueError(f"time must be greater than 0, currently {value}")
    return value
Taken together these checks require a value that is both less than or equal to -1 (time_max) and greater than or equal to 0 (time_min), so no non-None value can ever pass. That's simply incorrect. On the one hand this should definitely allow anything > 0; on the other, possibly values < -1 as well. Not 100% sure why, and if so, which values should be disallowed.
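A sketch of what the check presumably should be, assuming time holds seconds since the Unix epoch so that any non-negative value is fine (validate_taa_time is a hypothetical helper name; in the generated model it would stay a pydantic @validator("time") like the ones above):

```python
from typing import Optional

def validate_taa_time(value: Optional[int]) -> Optional[int]:
    # Property is optional
    if value is None:
        return None
    # Assumption: the field carries seconds since the Unix epoch,
    # so anything >= 0 should be accepted.
    if value < 0:
        raise ValueError(f"time must be a non-negative integer, currently {value}")
    return value
```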
At some point I would like to update all the notebooks so these could be the docs.
But currently they are all out of date.
So pointing to them in the README is likely very confusing.
Hello everyone, I'm starting to use aries-cloudcontroller, but I don't understand what the equivalent is for something like this:
aca-py start \
  --label Alice \
  -it http 0.0.0.0 8000 \
  -ot http \
  --admin 0.0.0.0 11000 \
  --admin-insecure-mode \
  --genesis-url http://localhost:9000/genesis \
  --seed Alice000000000000000000000000001 \
  --endpoint http://localhost:8000/ \
  --debug-connections \
  --auto-provision \
  --wallet-type indy \
  --wallet-name Alice1 \
  --wallet-key secret
What's the object I need to instantiate? I'm trying to emulate the classic Alice-Faber example.
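Most of those flags configure the agent itself; the controller only needs to reach the agent's admin interface. A minimal sketch of the mapping (the values are assumptions read off the CLI flags above, and the AcaPyClient constructor shape is taken from other issues in this page):

```python
# Hypothetical mapping: the transport, wallet, seed and genesis flags stay on
# the agent; the controller only needs the admin interface from --admin.
admin_config = {
    "base_url": "http://localhost:11000",  # from --admin 0.0.0.0 11000
    "api_key": None,                       # --admin-insecure-mode: no key needed
}

# With those values the client would be created roughly as:
#   controller = aries_cloudcontroller.AcaPyClient(**admin_config)
```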
calling send_proposal
returns the following error:
Record ID not provided.
That is rather surprising, as a record ID is not required; it is not even defined in any of the cascading models involved:
body=V20IssueCredSchemaCore -> filter=V20CredFilter -> indy=V20CredFilterIndy
V20CredFilterIndy attrs:
cred_def_id: Credential definition identifier [Optional].
issuer_did: Credential issuer DID [Optional].
schema_id: Schema identifier [Optional].
schema_issuer_did: Schema issuer DID [Optional].
schema_name: Schema name [Optional].
schema_version: Schema version [Optional].
What is going amiss here? Are the generated input types incorrect? Is a record ID required after all?
Hey guys, I was playing around with the package trying to see how it works, but it kept failing because the requirements file was named requires.txt instead of requirement.txt, so pip couldn't find the dependencies to install.
I renamed the file, moved it into the root of the folder, re-created the .tar.gz, ran pip install, and it worked.
here is the new updated version
The trust_ping method needs updating.
Currently you must pass a msg value into this function, but this is actually an optional comment. It is also sent as {"content": msg} rather than {"comment": msg}. I suggest renaming msg to comment and making it an optional argument.
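A minimal sketch of the suggested payload shape (build_trust_ping_body is a hypothetical helper name; the "comment" key is taken from the description above):

```python
from typing import Optional

def build_trust_ping_body(comment: Optional[str] = None) -> dict:
    # The comment is optional, so an empty body is valid; when present it
    # should be keyed "comment", not "content".
    if comment is None:
        return {}
    return {"comment": comment}
```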
This is old code that I wrote but now I think we should remove it. It's causing me all sorts of problems.
Basically we are tracking a representation of the connection object see models/connection.py
This is used to check whether the connection is active or not before allowing actions such as send_credential.
The problem is that unless the default_handler in the connections controller is set, the is_active check prevents any function calls.
I suggest we use this issue as a reason to rethink how this should work completely and if we should even be doing this kind of thing in the controller at all.
Do we want default_handlers?
Currently when a request fails it won't throw an error. This makes it hard to detect what went wrong (404, 500, etc...). I'd like to add some error handling decorators to the API: https://uplink.readthedocs.io/en/stable/user/quickstart.html#response-and-error-handling
We need to review the code and add comprehensive logging as appropriate. We also need to test and demonstrate how these logs can be accessed by the developer using the controller.
Could we remove this print statement (turn to logs I suppose) -
The base.py file under controllers is used by all controllers and defines generic POST, GET etc methods for them.
This code originally came from the ACA-Py demo folder and has been iterated on since. However, as it is pretty foundational to the repo it would be great to give it a thorough review.
I think it is not production quality at the moment.
There is also some stuff like EVENT_LOGGER that seems left over from the ACA-Py world
see also this issue
The above patch to get issuer methods working unfortunately breaks the transaction endorser mechanisms. That is because uplink fails to support a Union return type: it checks for the return type being a class instance, which Union is not.
This eventually/soon needs to be addressed
I did not find anything about an embedded webhook handler. Does the project have an option for, or an alternative to, this?
Line 152 - 157 of the issuer API need revisiting.
My IDE does not like them. There are a couple of bugs there, but it's an easy fix, plus anywhere else in the code base this pattern has been used.
I would just push a fix to main, but I'm not sure about messing up the build etc.
Will sort it on Monday or whenever.
The method get_cred_def under credential_definition errors out for the return type CredentialDefinitionGetResult. This can be remedied by using the Any type instead, but that shouldn't be the solution.
Maybe you can have a look at that as well @TimoGlastra?
No option to add additional query arguments:
See below:
connection_id: string ($uuid) (query) | Connection identifier
role: string (query) | Role assigned in credential exchange. Available values: issuer, holder
state: string (query) | Credential exchange state. Available values: proposal_sent, proposal_received, offer_sent, offer_received, request_sent, request_received, credential_issued, credential_received, credential_acked
thread_id: string ($uuid) (query) | Thread identifier
Can also check them from the swagger API.
Note: the proofs API is similar but does support these query args. We should implement the same functionality, I imagine.
The method issue_credential_automated produces validation errors in the return type checks of V10CredentialExchange. That is somewhat surprising, as the response is produced solely by aca-py. Hence, I suspect that something is not quite right in the validation mechanism of V10CredentialExchange.
here the relevant part of the error stack:
pydantic.error_wrappers.ValidationError: 1 validation error for V10CredentialExchange
credential_offer_dict -> offers~attach -> 0 -> data -> base64
Value of base64 does not match regex pattern ('^[a-zA-Z0-9+\/]*={0,2}$') (type=value_error)
The revocations.revoke_credential function has a bug in it.
These arguments should all default to None, otherwise the logic doesn't work:
cred_ex_id: str = "", cred_rev_id: str = "", rev_reg_id: str = ""
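A sketch of the intended argument check with None defaults, assuming (as aca-py's revocation API seems to expect) that either cred_ex_id alone, or cred_rev_id together with rev_reg_id, identifies the credential to revoke (revoke_credential_args_ok is a hypothetical helper name):

```python
from typing import Optional

def revoke_credential_args_ok(
    cred_ex_id: Optional[str] = None,
    cred_rev_id: Optional[str] = None,
    rev_reg_id: Optional[str] = None,
) -> bool:
    # With None defaults, "argument was not supplied" is unambiguous,
    # which the "" defaults above break.
    if cred_ex_id is not None:
        return True
    return cred_rev_id is not None and rev_reg_id is not None
```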
We should probably think about building a new release soon as we have made a few fixes lately
There currently seems to be a bug in query_subwallets in multitenant.py. When passing a wallet_name param it doesn't get added to the request, hence it returns all the wallets instead of a specific one.
This can be fairly easily fixed by adding the wallet_name to the request.
Hey folks,
since we decided to use the generated uplink controller and that is working alright so far, I'm posing this as a question:
Codacy was introduced to keep tabs on the quality of current and new code. The new uplink version causes a severe amount of issues though.
3289 Issues | +16 Complexity | +275 Duplication
IMHO it's fairly ridiculous to use Codacy in this way. We could either:
There seems to be a bug in the new uplink implementation, caused by the model validation in the publish_schema() method under schemas. Trying to publish a schema causes the following TypeError:
File "/usr/local/lib/python3.8/abc.py", line 102, in __subclasscheck__
return _abc_subclasscheck(cls, subclass)
TypeError: issubclass() arg 1 must be a class
Initially I discovered this when using the uplink controller in the cloudapi app and thought it might have been caused by some fastapi code as well. However, I found it to be a problem with the uplink implementation after trying this with an AcaPyClient instance straight from a Python shell:
>>> cl = aries_cloudcontroller.AcaPyClient(base_url="localhost:3021", api_key="adminApiKey")
>>> from aries_cloudcontroller import SchemaSendRequest
>>> schema_definition = SchemaSendRequest(
... attributes=['hello', 'world'],
... schema_name='planet',
... schema_version="0.1"
... )
>>> cl.schema.publish_schema(body=schema_definition)
<coroutine object SchemaApi.publish_schema at 0x7fafba05ed40>
>>> await cl.schema.publish_schema(body=schema_definition)
File "<stdin>", line 1
SyntaxError: 'await' outside function
>>> import asyncio
>>> loop = asyncio.get_event_loop()
>>> loop.run_until_complete(schema_definition = SchemaSendRequest(
... attributes=['hello', 'world'],
... schema_name='planet',
... schema_version="0.1"
...
...
...
KeyboardInterrupt
>>> loop.run_until_complete(cl.schema.publish_schema(body=schema_definition))
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
return future.result()
File "/usr/local/lib/python3.8/site-packages/aries_cloudcontroller/api/schema.py", line 57, in publish_schema
return await self.__publish_schema(
File "/usr/local/lib/python3.8/site-packages/uplink/builder.py", line 95, in __call__
self._request_definition.define_request(request_builder, args, kwargs)
File "/usr/local/lib/python3.8/site-packages/uplink/commands.py", line 287, in define_request
self._method_handler.handle_builder(request_builder)
File "/usr/local/lib/python3.8/site-packages/uplink/decorators.py", line 62, in handle_builder
annotation.modify_request(request_builder)
File "/usr/local/lib/python3.8/site-packages/uplink/returns.py", line 64, in modify_request
converter = request_builder.get_converter(
File "/usr/local/lib/python3.8/site-packages/uplink/helpers.py", line 96, in get_converter
return self._converter_registry[converter_key](*args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/uplink/converters/__init__.py", line 54, in __call__
converter = self._converter_factory(*args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/uplink/converters/__init__.py", line 114, in chain
converter = func(factory)(*args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/uplink/converters/typing_.py", line 127, in create_response_body_converter
return self._base_converter(type_)
File "/usr/local/lib/python3.8/site-packages/uplink/converters/typing_.py", line 121, in _base_converter
if issubclass(type_.__origin__, self.typing.Sequence):
File "/usr/local/lib/python3.8/typing.py", line 774, in __subclasscheck__
return issubclass(cls, self.__origin__)
File "/usr/local/lib/python3.8/abc.py", line 102, in __subclasscheck__
return _abc_subclasscheck(cls, subclass)
TypeError: issubclass() arg 1 must be a class
>>>
Interestingly, the stdout stream of the ledger shows that the schema does get published:
14:35:14,523 indy.libindy.native.indy.services.ledger INFO src/services/ledger/mod.rs:143 | build_schema_request() => Ok("{\"reqId\":1630593314522925994,\"identifier\":\"UXmTgD2HaqMkQKaPVUsJzD\",\"operation\":{\"type\":\"101\",\"data\":{\"name\":\"planet\",\"version\":\"0.1\",\"attr_names\":[\"world\",\"hello\"]}},\"protocolVersion\":2}")
This problem has been encountered before; see pydantic here and fastapi here. So this appears to be a pydantic model/type-checking issue that is possibly based on a Python standard library bug.
@TimoGlastra You have pointed out that it likely is a problem here
I partly agree, because the fact that the ledger does receive the definition implies that the error is raised by one (or more) of the return types/models of the __publish_schema method.
Any thoughts and suggestions on how to fix that would be great.
Most important to me is the latter question, since the models are autogenerated: working around this manually in the model definitions would not be ideal, as it would get overwritten by any subsequent release being created and used.
IndyRequestedCredsRequestedPred timestamp validator should only have > 0 check
The from_ and to attrs validators seem to be wrong in non_revoked. The following error was encountered:
pydantic.error_wrappers.ValidationError: 2 validation errors for IndyProofRequest
E requested_attributes -> additionalProp1 -> non_revoked -> from
E from_ must be less than -1, currently 1640995199 (type=value_error)
E requested_attributes -> additionalProp1 -> non_revoked -> to
E to must be less than -1, currently 1640995199 (type=value_error)
Similar errors have been encountered before. The values should not, as the validators in aries_cloudcontroller/model/indy_proof_req_attr_spec_non_revoked.py suggest, be less than -1 (to) and greater than 0 (from_). Both should just be non-negative integers(?), since according to the aca-py swagger these values are seconds since the Unix epoch.
Actually, this just seems off. From aries_cloudcontroller/model/indy_proof_req_attr_spec_non_revoked.py:
@validator("from_")
def from__max(cls, value):
    # Property is optional
    if value is None:
        return
    if value > -1:
        raise ValueError(f"from_ must be less than -1, currently {value}")
    return value

@validator("from_")
def from__min(cls, value):
    # Property is optional
    if value is None:
        return
    if value < 0:
        raise ValueError(f"from_ must be greater than 0, currently {value}")
    return value

@validator("to")
def to_max(cls, value):
    # Property is optional
    if value is None:
        return
    if value > -1:
        raise ValueError(f"to must be less than -1, currently {value}")
    return value

@validator("to")
def to_min(cls, value):
    # Property is optional
    if value is None:
        return
    if value < 0:
        raise ValueError(f"to must be greater than 0, currently {value}")
    return value
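A sketch of what the corrected checks could look like, assuming (per the aca-py swagger description) that both bounds are plain seconds-since-epoch integers; validate_non_revoked_bound is a hypothetical helper, and in the generated model it would remain per-field @validator methods:

```python
from typing import Optional

def validate_non_revoked_bound(name: str, value: Optional[int]) -> Optional[int]:
    # Property is optional
    if value is None:
        return None
    # Assumption: from_/to are seconds since the Unix epoch,
    # so any non-negative value should pass.
    if value < 0:
        raise ValueError(f"{name} must be a non-negative integer, currently {value}")
    return value
```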
Currently the build and publish pipeline runs on all branches. That is not the desired behaviour, and is moreover annoying.
I'll have a go at this myself asap.
Currently, if the body is optional and no parameters are provided, you still need to provide an empty body. We should update the cloud controller to initialize the body parameter class as the default value when the body is optional.
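A sketch of the idea with a stand-in dataclass (BodySketch and with_default_body are hypothetical names; the real fix would construct the generated model class, e.g. an all-optional request model, as the default):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BodySketch:
    # Stand-in for an all-optional generated request model
    alias: Optional[str] = None

def with_default_body(body: Optional[BodySketch] = None) -> BodySketch:
    # Fall back to an empty instance so callers no longer have to
    # pass an empty body explicitly.
    return body if body is not None else BodySketch()
```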
Hey,
Just been playing.
It seems I can run init_webhook_server without passing in any arguments and it doesn't throw an error.
Feels like we need some better handling of this. When I init the webhook server, I really want to know it's been initialised successfully. I again find myself thinking that perhaps init and listen should be combined? Or at least some check and confirmation that the right arguments have been passed in.
Thoughts?
The types for role in both credential exchange records v10 and v20 assume the options issuer and holder. Using the controller, however, I get an error in the proofs flow that the role unexpectedly is prover.
Is prover a valid role from aca-py, or is something else going wrong?
I'm not sure, so I'm putting this as a question, although I do think the fix should probably just be adding prover to the role options in V10CredentialExchange and V20CredExRecord.
The validator for V20CredFilterIndy seems off. See also #55 #56 #57 #58. Simply skipping this validation enables a successful workflow (given correct input, it produces correct output against aca-py).
@validator("cred_def_id")
def cred_def_id_pattern(cls, value):
    # Property is optional
    if value is None:
        return
    pattern = r"^([123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz]{21,22}):3:CL:(([1-9][0-9]*)|([123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz]{21,22}:2:.+:[0-9.]+)):(.+)?$"
    if not re.match(pattern, value):
        raise ValueError(
            f"Value of cred_def_id does not match regex pattern ('{pattern}')"
        )
    return value
Work has been undertaken in ACA-Py to explore supporting credential issuance for large creds (larger than 1 Mb, up to 16 Mb).
The current default for an aiohttp web.Application() is a 1 Mb max client request size. ACA-Py has added an additional config arg to change this on the agent: https://github.com/hyperledger/aries-cloudagent-python/blob/68347f8e1f3b918464019bdc2d891926de023f5c/aries_cloudagent/config/argparse.py#L164
For our webhook server to receive and process events from an agent processing messages of this size, we would need to support somewhere in the order of 4x the credential size. This data gets represented multiple times in the webhook payload by the time the protocol gets into the credential_acked state.
For now we could just add an optional argument to specify the client_max_size when initialising the webhook?
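As a rough sketch of the sizing, assuming the roughly 4x factor above (the numbers are illustrative; aiohttp's web.Application does accept a client_max_size keyword, which is where this value would be passed):

```python
# Illustrative sizing for the webhook server's request limit, assuming a
# credential can be represented ~4x in a single webhook payload.
MAX_CRED_SIZE = 16 * 1024 ** 2       # 16 Mb, the upper bound explored in ACA-Py
CLIENT_MAX_SIZE = 4 * MAX_CRED_SIZE  # headroom for repeated representations

# With aiohttp this would then be passed through on initialisation, e.g.:
#   app = aiohttp.web.Application(client_max_size=CLIENT_MAX_SIZE)
```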
The problem-report endpoint is used by present-proof and issue-credential and maybe others.
Currently the proofs API writes it like this:
async def send_problem_report(self, pres_ex_id, request):
    return await self.admin_POST(f"{self.base_url}/records/{pres_ex_id}/problem-report", json_data=request)
This is confusing, because the parameter should not be called request. The request body is actually just:
{
    "explain_ltxt": "string"
}
I suggest we change request to explanation: str and build the JSON object within the function.
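A sketch of the suggested change, reusing the method/URL shape from the snippet above (build_problem_report_body is a hypothetical helper name; admin_POST is the controller's existing helper):

```python
def build_problem_report_body(explanation: str) -> dict:
    # Build the JSON body inside the controller rather than asking
    # callers to know the wire format.
    return {"explain_ltxt": explanation}

async def send_problem_report(self, pres_ex_id, explanation: str):
    return await self.admin_POST(
        f"{self.base_url}/records/{pres_ex_id}/problem-report",
        json_data=build_problem_report_body(explanation),
    )
```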
Let's say you have the following method and you call it with auto_accept=True: it will actually call the connections/create-invitation endpoint with auto_accept=True instead of auto_accept=true.
@returns.json
@json
@post("/connections/create-invitation")
def __create_invitation(
    self,
    *,
    alias: Query = None,
    auto_accept: Query = None,
    multi_use: Query = None,
    public: Query = None,
    body: Body(type=CreateInvitationRequest) = {}
) -> InvitationResult:
    """Internal uplink method for create_invitation"""
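One way to fix it would be to coerce boolean query values to their lowercase JSON form before serialisation; a sketch (coerce_query_value is a hypothetical helper, and where exactly it would hook into uplink's query handling is left open):

```python
def coerce_query_value(value):
    # str(True) yields "True", but the admin API expects lowercase
    # JSON booleans in the query string.
    if isinstance(value, bool):
        return "true" if value else "false"
    return value
```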
Another api bug to fix.
In the issue API, the store_credential function takes a record_id and a credential_id. The credential_id is used to create a body that is sent with the API call. However, this is actually optional, whereas in our controller it is not.
Should be a minor fix.
Currently the create_invitation function specifies optional parameters public and multi_use. Both inputs are specified as string values.
This is not in line with the Swagger, and while it works if I pass in a string public="true", they should be boolean values.
@kaysiz if you are going through the API anyway be good to pick this up along the way.
It would be quite handy being able to use the controller as a context manager like so:
async with aries_cloudcontroller(args) as controller:
    ...
    sth = await controller.some_method(some_args)
    ...
    return cool_result
Why is that useful? It would give us __aenter__ and __aexit__, so we no longer need to worry about terminating the controller explicitly. This is not urgent right now, as we can always resort to the decorator contextlib provides. But nice to have, probably.
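A minimal sketch of the shape this could take (all names hypothetical; terminate() stands in for however the real controller is actually shut down):

```python
import asyncio

class ControllerSketch:
    def __init__(self):
        self.closed = False

    async def terminate(self):
        # Stand-in for releasing sessions, webhook servers, etc.
        self.closed = True

    async def __aenter__(self):
        return self

    async def __aexit__(self, exc_type, exc, tb):
        # Guarantees cleanup even if the body raises
        await self.terminate()

# Usage:
#   async with ControllerSketch() as controller:
#       result = await controller.some_method(...)
```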
Currently, using Python 3.8 causes the following warning (when running the tests via pytest):
site-packages/uplink/clients/io/asyncio_strategy.py:26: DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
There is a related open issue in uplink covering problems with using uplink under Python 3.8; this is another one.
We should keep an eye on this, as it will likely cause problems in the future.