
LTI Consumer XBlock

Purpose

This XBlock implements the consumer side of the LTI specification, enabling integration of third-party LTI provider tools.

Getting Started

Installation

For details regarding how to deploy this or any other XBlock in the LMS, see the installing-the-xblock documentation.

Installing in Docker Devstack

Assuming that your devstack repo lives at ~/code/devstack and that edx-platform lives right alongside that directory, you'll want to check out xblock-lti-consumer so that it lives in ~/code/src/xblock-lti-consumer. This will make it so that you can access it inside an LMS container shell and easily make modifications for local testing.

You will have to run the instructions below twice, once for the LMS and once for Studio; otherwise the two containers will be using different versions of the XBlock.

Run make dev.shell.lms or make dev.shell.studio from your devstack directory to enter a running container. Once in there, you can do the following to have your devstack pointing at a local development version of xblock-lti-consumer:

$ pushd /edx/src/xblock-lti-consumer
$ virtualenv venv/
$ source venv/bin/activate
$ make install
$ make test  # optional, if you want to see that everything works
$ deactivate
$ pushd  # should take you back to /edx/app/edxapp/edx-platform
$ pip uninstall -y lti_consumer_xblock
$ pip install -e /edx/src/xblock-lti-consumer

Enabling in Studio

You can enable the LTI Consumer XBlock in Studio through the advanced settings.

  1. From the main page of a specific course, navigate to Settings -> Advanced Settings from the top menu.
  2. Check for the advanced_modules policy key, and add "lti_consumer" to the policy value list.
  3. Click the "Save changes" button.
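For example, after step 2 the Advanced Module List value might look like this (a sketch; your course may list other modules alongside it):

```json
[
    "lti_consumer"
]
```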

Developing

One Time Setup

# Clone the repository
git clone git@github.com:openedx/xblock-lti-consumer.git
cd xblock-lti-consumer

# Set up a virtualenv using virtualenvwrapper with the same name as the repo and activate it
mkvirtualenv -p python3.8 xblock-lti-consumer

Every time you develop something in this repo

# Activate the virtualenv
workon xblock-lti-consumer

# Grab the latest code
git checkout master
git pull

# Install/update the dev requirements
make install

# Run the tests (to verify the status before you make any changes)
make test

# Make a new branch for your changes
git checkout -b <your_github_username>/<short_description>

# Using your favorite editor, edit the code to make your change.
vim ...

# Changes to style rules should be made to the Sass files, compiled to CSS,
# and committed to the git repository.
make compile-sass

# Run your new tests
pytest ./path/to/new/tests

# Run quality checks
make quality

# Add a changelog entry to CHANGELOG.rst

# Commit all your changes
git commit ...
git push

# Open a PR and ask for review.

Package Requirements

setup.py contains a list of package dependencies which are required for this XBlock package. This list is what is used to resolve dependencies when an upstream project is consuming this XBlock package. requirements.txt is used to install the same dependencies when running the tests for this package.

Further Development Info

See the developer guide for implementation details and other developer concerns.

Testing

Unit Testing

  • To run all of the unit tests at once, run make test
  • To run tests on individual files in development, run python ./test.py -k=[name of test file without .py]
  • For example, if you want to run the tests in test_permissions.py, run python ./test.py -k=test_permissions
  • To run a specific test in a file, run something like python ./test.py -k=test_permissions.TestClass.test_function

Testing Against an LTI Provider

LTI 1.1

http://lti.tools/saltire/ provides a "Test Tool Provider" service that allows you to see messages sent by an LTI consumer.

We have some useful documentation on how to set this up here: http://edx.readthedocs.io/projects/open-edx-building-and-running-a-course/en/latest/exercises_tools/lti_component.html#lti-authentication-information

  1. In Studio Advanced settings, set the value of the "LTI Passports" field to "test:test:secret" - this will set the OAuth client key and secret used to send a message to the test LTI provider.
  2. Create an LTI Consumer problem in a course in Studio (after enabling it in "advanced_modules" as seen above). Make a unit, select "Advanced", then "LTI Consumer".
  3. Click edit and fill in the following fields: LTI ID: "test" LTI URL: "https://lti.tools/saltire/tp"
  4. Click save. The unit should refresh and you should see "Passed" in the "Verification" field of the message tab in the LTI Tool Provider emulator.
  5. Click the "Publish" button.
  6. View the unit in your local LMS. If you get an ImportError: No module named lti_consumer, you should docker-compose restart lms (since we previously uninstalled the lti_consumer to get the tests for this repo running inside an LMS container). From here, you can see the contents of the messages that we are sending as an LTI Consumer in the "Message Parameters" part of the "Message" tab.
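The "test:test:secret" passport above follows the id:client_key:client_secret format. A minimal sketch of how such a passport string decomposes (the helper name is hypothetical):

```python
def parse_lti_passport(passport: str) -> dict:
    """Split an LTI passport of the form "id:client_key:client_secret"."""
    lti_id, client_key, client_secret = passport.split(":")
    return {"id": lti_id, "client_key": client_key, "client_secret": client_secret}
```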

LTI 1.3

IMS Global provides a reference implementation of LTI 1.3 that can be used to test the XBlock.

LTI 1.3 uses OAuth2 with the Client Credentials grant for authentication. This means that to configure the tool, the LMS needs to know the tool's keyset URL or public key, and the tool needs to know the LMS's.
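A sketch of the form body sent in a Client Credentials token request, per the IMS Security Framework. The helper name is hypothetical, and the signed JWT client assertion is assumed to be produced elsewhere:

```python
def build_token_request(signed_jwt: str, scopes: list) -> dict:
    """Build the form body for an OAuth2 client-credentials request
    that authenticates with a JWT client assertion (RFC 7523)."""
    return {
        "grant_type": "client_credentials",
        "client_assertion_type": "urn:ietf:params:oauth:client-assertion-type:jwt-bearer",
        "client_assertion": signed_jwt,
        "scope": " ".join(scopes),
    }
```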

Instructions:

  1. Set up a local tunnel (using ngrok or a similar tool) to get a URL accessible from the internet.

  2. Add the following settings to edx-platform/lms/envs/private.py and edx-platform/cms/envs/private.py:

  3. Create a new course, and add the lti_consumer block to the advanced modules list.

  4. In the course, create a new unit and add the LTI block.

    • Set LTI Version to LTI 1.3.
    • Set the Tool Launch URL to https://lti-ri.imsglobal.org/lti/tools/
  5. In Studio, you'll see a few parameters being displayed in the preview:

Client ID: f0532860-cb34-47a9-b16c-53deb077d4de
Deployment ID: 1
# Note that these are LMS URLs
Keyset URL: http://1234.ngrok.io/api/lti_consumer/v1/public_keysets/88e45ecbd-7cce-4fa0-9537-23e9f7288ad9
Access Token URL: http://1234.ngrok.io/api/lti_consumer/v1/token/8e45ecbd-7cce-4fa0-9537-23e9f7288ad9
OIDC Callback URL: http://localhost:18000/api/lti_consumer/v1/launch/
  6. Set up a tool in the IMS Global reference implementation (https://lti-ri.imsglobal.org/lti/tools/).
  7. Go back to Studio and edit the block, adding its settings (you'll find them by scrolling down https://lti-ri.imsglobal.org/lti/tools/ until you find the tool you just created):
Tool Launch URL: https://lti-ri.imsglobal.org/lti/tools/[tool_id]/launches
Tool Initiate Login URL: https://lti-ri.imsglobal.org/lti/tools/[tool_id]/login_initiations
Tool Public Key: the public key from the tool's key page.
  8. Publish the block, log into the LMS, and navigate to the LTI block page.
  9. Click Send Request and verify that the LTI launch was successful.

LTI Advantage Features

This XBlock supports LTI 1.3 and the following LTI Advantage services:

  • Deep Linking (LTI-DL)
  • Assignments and Grades services (LTI-AGS)
  • Names and Roles Provisioning services (LTI-NRPS)

To enable LTI-AGS, you need to set LTI Assignment and Grades Service in Studio to allow tools to send back grades. There are two grade interaction models implemented:

  • Allow tools to submit grades only (declarative, default): enables LTI-AGS and creates a single fixed LineItem to which tools can send grades.
  • Allow tools to manage and submit grades (programmatic): enables LTI-AGS and enables full access to LTI-AGS endpoints. Tools will be able to create, manage, and delete multiple LineItems, and set multiple grades per student per problem. In this implementation, the tool is responsible for managing grades and linking them in the LMS.
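In the programmatic model, a tool creates LineItems itself. A sketch of the payload a tool might POST to the LineItems endpoint (field names follow the LTI-AGS specification; the helper name and values are illustrative):

```python
def make_line_item(label: str, score_maximum: float,
                   resource_id: str = None, tag: str = None) -> dict:
    """Build an LTI-AGS LineItem payload with the spec's field names."""
    item = {"label": label, "scoreMaximum": score_maximum}
    if resource_id:
        item["resourceId"] = resource_id
    if tag:
        item["tag"] = tag
    return item
```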

To enable LTI-DL and its capabilities, you need to set these settings in the block:

  1. Locate the Deep linking setting and set it to True (enabled).
  2. Set Deep Linking Launch URL setting. You can retrieve it from the tool you’re integrating with. If it’s not provided, try using the same value as in the LTI 1.3 Tool Launch URL.

To enable LTI-NRPS, set Enable LTI NRPS to True in the block settings in Studio.

LTI 1.1/1.2 Basic Outcomes Service 1.1

This XBlock supports the LTI 1.1/1.2 Basic Outcomes Service 1.1. Please see the LTI 1.1/1.2 Basic Outcomes Service instructions for testing the implementation.

LTI 2.0 Result Service 2.0

This XBlock supports the LTI 2.0 Result Service 2.0. Please see the LTI 2.0 Result Service instructions for testing the implementation.

LTI Reusable configuration

The LTI Consumer XBlock supports configuration reusability via plugins. It is compatible with both LTI 1.1 and LTI 1.3. All values (including the access token and keyset URL for LTI 1.3) are shared across the XBlocks with the same external configuration ID. This eliminates the need to have a tool deployment for each XBlock.

How to Setup

  1. Install and set up the openedx-ltistore plugin on the LMS and Studio.
  2. Go to LMS admin -> WAFFLE_UTILS -> Waffle flag course override (http://localhost:18000/admin/waffle_utils/waffleflagcourseoverridemodel/).
  3. Create a waffle flag course override with these values:
    • Waffle flag: lti_consumer.enable_external_config_filter
    • Course id: <your course id>
    • Override choice: Force On
    • Enabled: True
  4. Create a new external LTI configuration and use it in the XBlock. This is explained in the README of the openedx-ltistore repository.

Getting Help

If you're having trouble, we have discussion forums at https://discuss.openedx.org where you can connect with others in the community.

Our real-time conversations are on Slack. You can request a Slack invitation, then join our community Slack workspace.

For anything non-trivial, the best path is to open an issue in this repository with as many details about the issue you are facing as you can provide.

https://github.com/openedx/xblock-lti-consumer/issues

For more information about these options, see the Getting Help page.

License

The code in this repository is licensed under the AGPL v3 License unless otherwise noted.

Please see LICENSE.txt for details.

Contributing

Contributions are very welcome. Please read How To Contribute for details.

This project is currently accepting all types of contributions: bug fixes, security fixes, maintenance work, and new features. However, please make sure to discuss any new feature idea with the maintainers before beginning development, to maximize the chances of your change being accepted. You can start a conversation by creating a new issue on this repo summarizing your idea.

The Open edX Code of Conduct

All community members are expected to follow the Open edX Code of Conduct.

People

The assigned maintainers for this component and other project details may be found in Backstage. Backstage pulls this data from the catalog-info.yaml file in this repo.

Reporting Security Issues

Please do not report security issues in public. Please email [email protected].

xblock-lti-consumer's People

Contributors

aht007, ahtishamshahid, alangsto, ashultz0, connorhaugh, cpennington, davidjoy, dianakhuang, edx-requirements-bot, feanil, giovannicimolin, iamsobanjaved, ilee2u, jawayria, kuipumu, michaelroytman, mikix, nedbat, noraiz-anwar, qubad786, sarina, shadinaif, shimulch, stvstnfrd, tecoholic, usamasadiq, varshamenon4, xitij2000, zacharis278, ziafazal


xblock-lti-consumer's Issues

PII Sharing in LTI 1.3 Works Inconsistently

PII sharing in LTI 1.3 launches works inconsistently and potentially has a few bugs.

How Do We Enable PII Sharing?

As a review, there are two key pieces of data that control when PII is shared in LTI. This summary comes from reading the code and from the Unified Flag for Enabling Sharing of PII in LTI ADR. It summarizes how these data are used in LTI 1.1.

  • CourseAllowPIISharingInLTIFlag: a configuration model that controls whether the two LtiConsumerXBlock fields ask_to_send_username and ask_to_send_email are editable in Studio, and whether their values are respected for PII sharing. If this model is not enabled, PII should not be shared, regardless of the values of the aforementioned fields.
  • ask_to_send_username, ask_to_send_email: two LtiConsumerXBlock fields that are editable in Studio if CourseAllowPIISharingInLTIFlag is enabled; they control whether to send the username and email claims in the LTI ID token, respectively.
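The gating described above can be sketched as a pure function (the helper is hypothetical; the real logic is spread across the XBlock and the configuration model):

```python
def pii_fields_to_send(flag_enabled: bool, ask_to_send_username: bool,
                       ask_to_send_email: bool) -> set:
    """No PII is shared unless CourseAllowPIISharingInLTIFlag is enabled."""
    if not flag_enabled:
        return set()
    fields = set()
    if ask_to_send_username:
        fields.add("username")
    if ask_to_send_email:
        fields.add("email")
    return fields
```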

How Does PII Sharing Work in LTI 1.3?

LTI 1.3 Resource Link Launch

  • Neither the username nor email claim is sent in the LTI ID token, regardless of the values of the pieces of data above.
  • If either or both of ask_to_send_username or ask_to_send_email are enabled, a PII sharing consent modal appears before the LTI launch. Even though this consent modal is displayed, no such data is ever shared.
  • The CourseAllowPIISharingInLTIFlag is not used to determine whether to display this modal - only ask_to_send_username and ask_to_send_email.
  • The PII sharing consent modal appears before an LTI 1.3 launch only for modal or new_window launches, not inline launches.

LTI 1.3 Names and Role Provisioning Context Membership Service Call

  • CourseAllowPIISharingInLTIFlag is used to determine whether to send PII, but ask_to_send_username and ask_to_send_email are not used; PII is shared whenever CourseAllowPIISharingInLTIFlag is enabled.
  • The data that are sent are name and email, not username and email. The ask_to_send_username and ask_to_send_email fields would suggest that we should send username and email (even though these fields are not used).

Recommendations

  • CourseAllowPIISharingInLTIFlag should also control whether the PII sharing consent modal is displayed.
  • The PII sharing consent modal should appear before inline LTI launches as well.
  • When CourseAllowPIISharingInLTIFlag is enabled, the LTI launch should include PII, depending on the values of ask_to_send_username and ask_to_send_email.
  • The LTI 1.3 Names and Role Provisioning Context Membership Service endpoint should send username and email if and only if CourseAllowPIISharingInLTIFlag is enabled and ask_to_send_username or ask_to_send_email are enabled, respectively.
  • The LTI 1.3 Names and Role Provisioning Context Membership Service endpoint should send a username claim instead of a name claim when ask_to_send_username is enabled.

Questions

  • What is the history and set of decisions underlying this PII sharing behavior? Was it intentionally implemented this way?
  • Can PII be shared in LTI 1.3 launches? Is there any history of legal concerns?
  • Is there a reason that the PII sharing consent modal does not appear for inline launches? Is this intentional or unintentional? Is it a limitation of the frontend technology?
  • We would like to extend PII sharing to include other user identity claims that tools can reasonably expect - see list here, like name. Is this of any concern?

Potential creation of duplicate LtiConfiguration models

Problem

On the edx.org site we have noticed a small number of LtiConfiguration objects show up that are duplicates for the same block location. This should not be possible, since we only ever create one LtiConfiguration per block, based on the get_or_create call referenced below. This is the only function that can create an LtiConfiguration instance.

lti_config, _ = LtiConfiguration.objects.get_or_create(location=block_location)

However, this constraint is only enforced in code and not at the database level. If multiple calls to this function are received concurrently on a site with multiple workers, it is possible to end up with multiple objects.

I believe this is the cause for two reasons:

  1. All duplicate instances have adjacent database ids
  2. During a period of degraded site performance or even a bad client network connection it is easy to POST multiple requests. The modal to save the lti configuration does not close or disable while a request is in flight.

Impact

If there are multiple configurations with the same location identifier, the LTI unit will not render: any code that attempts to reference that config raises an lti_consumer.models.LtiConfiguration.MultipleObjectsReturned error.

Possible Solutions

  1. Add a database constraint to enforce the location attribute is unique.
    • This might be tricky to release, since the new constraint will fail without manually cleaning up the existing data. It's possible other sites unknowingly have this problem.
  2. Alter the UI so it's less likely a user could submit the same configuration multiple times.
    • Doesn't fix the underlying issue, but may make it less likely to occur.
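Solution 1 usually pairs the database constraint with a retry in code: if a concurrent worker wins the race, the insert fails on the constraint and the existing row is fetched instead. An illustration using an in-memory stand-in for the table (no Django; all names are hypothetical):

```python
class DuplicateKey(Exception):
    """Stands in for the database's unique-constraint violation."""

class ConfigTable:
    """In-memory stand-in for the LtiConfiguration table."""
    def __init__(self):
        self._rows = {}

    def insert(self, location):
        if location in self._rows:  # enforced by the DB once the constraint exists
            raise DuplicateKey(location)
        self._rows[location] = {"location": location}
        return self._rows[location]

    def get(self, location):
        return self._rows[location]

def get_or_create(table, location):
    try:
        return table.insert(location)
    except DuplicateKey:
        # Another worker won the race; the unique row already exists.
        return table.get(location)
```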

Cannot access LTI Deep Link response in OpenEdx

Hi there, I ran into some errors while deploying an LTI tool in edx-platform.

I run both edX and the LTI tool on a local machine. edX uses ports 8000 and 8001, and the LTI tool uses 9001.

For edx, I use:

tutor mounts add ./edx-platform
tutor images build openedx-dev
tutor dev launch

which is in tutor dev

For LTI Tools, I use: pylti1.3-flask-example, which is running on 127.0.0.1:9001

So, when I tried to launch a deep link in Open edX, it returned Please check that you have course staff permissions and double check this block's LTI settings. I am sure that I was using a staff account, and I am quite sure I have set DCS_SESSION_COOKIE_SAMESITE = 'None' in edx-platform/lms/envs/.

I noticed that #218 mentioned a similar problem.

The error log in edx is below:

lms-1            | 2024-06-19 17:10:13,641 WARNING 114 [lti_consumer.plugin.views] [user None] [ip 172.19.X.X] views.py:496 - Permission on LTI Config <LtiConfiguration: [CONFIG_ON_XBLOCK] lti_1p3 - block-v1:edX+DemoX+Demo_Course+type@lti_consumer+block@ba1ec764026346b39f2cb233dce4f01a> denied for user <SimpleLazyObject: <django.contrib.auth.models.AnonymousUser object at 0x7f270d1e75e0>>:
lms-1            | 2024-06-19 17:10:13,648 DEBUG 114 [django.db.backends] [user None] [ip 172.19.X.X] utils.py:161 - (0.000) COMMIT; args=None; alias=default
lms-1            | [19/Jun/2024 17:10:13] "POST /api/lti_consumer/v1/lti/8/lti-dl/response HTTP/1.1" 403 7093

I masked my IP address in the error log. How can I solve this problem?

Thanks.

Studio Internal Error and LMS Error 500 with LTI Consumer XBlock on Local Tutor

Hi all,

I have a running local installation of Tutor and I want to use edX as an LTI consumer.
Because the default edX LTI module does not send several key pieces of information (such as "lis_person_name_full") to the LTI tool, I decided to install the following XBlocks: xblock-lti-consumer and tahoe-lti, which make it easy to pass such information. I followed the steps described in the Tutor documentation and everything went smoothly.

Before this installation, the default LTI module was working fine (except for the missing information), but after installation I get an Internal Error when launching the LTI tool from Studio and a 500 error when launching it from the LMS. Below are the logs I obtained (for both the CMS and LMS); it seems an unknown column 'lti_consumer_lticonfiguration.config_id' is requested at some point.

2021-03-04 22:16:13,759 INFO 9 [tracking] [user 3] [ip myip] logger.py:42 - {"name": "/preview/xblock/block-v1:+CINI01+2021T1+type@lti_consumer+block@6533e2f3e6de4ed682309994ff1640a1/handler/lti_launch_handler", "context": {"user_id": 3, "path": "/preview/xblock/block-v1:+CINI01+2021T1+type@lti_consumer+block@6533e2f3e6de4ed682309994ff1640a1/handler/lti_launch_handler", "course_id": "", "org_id": ""}, "username": "Scousix", "session": "9a85478154d9c1c38aa03f209b618792", "ip": "myip", "agent": "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:86.0) Gecko/20100101 Firefox/86.0", "host": "my.url", "referer": "https://my.url/container/block-v1:+CINI01+2021T1+type@vertical+block@80a74d2fd74e46f8841cc62be7df8873?action=new", "accept_language": "en-US,en;q=0.5", "event": "{"GET": {}, "POST": {}}", "time": "2021-03-04T22:16:13.758999+00:00", "event_type": "/preview/xblock/block-v1:+CINI01+2021T1+type@lti_consumer+block@6533e2f3e6de4ed682309994ff1640a1/handler/lti_launch_handler", "event_source": "server", "page": null}
cms_1 | 2021-03-04 22:16:13,783 ERROR 9 [cms.djangoapps.contentstore.views.preview] [user 3] [ip myip] preview.py:89 - error processing ajax call
cms_1 | Traceback (most recent call last):
cms_1 | File "/openedx/venv/lib/python3.8/site-packages/django/db/backends/utils.py", line 84, in _execute
cms_1 | return self.cursor.execute(sql, params)
cms_1 | File "/openedx/venv/lib/python3.8/site-packages/django/db/backends/mysql/base.py", line 71, in execute
cms_1 | return self.cursor.execute(query, args)
cms_1 | File "/openedx/venv/lib/python3.8/site-packages/MySQLdb/cursors.py", line 206, in execute
cms_1 | res = self._query(query)
cms_1 | File "/openedx/venv/lib/python3.8/site-packages/MySQLdb/cursors.py", line 319, in _query
cms_1 | db.query(q)
cms_1 | File "/openedx/venv/lib/python3.8/site-packages/MySQLdb/connections.py", line 259, in query
cms_1 | _mysql.connection.query(self, query)
cms_1 | MySQLdb._exceptions.OperationalError: (1054, "Unknown column 'lti_consumer_lticonfiguration.config_id' in 'field list

[04/Mar/2021:22:29:25 +0000] http://my.url "GET /courses/course-v1:+CINI01+2021T1/xblock/block-v1:+CINI01+2021T1+type@lti_consumer+block@b1196804ffdc4fbf9cf5c085db88d203/handler/lti_launch_handler HTTP/1.1" 500 9705 "https://my.url/courses/course-v1:+CINI01+2021T1/courseware/717b003631c84114b86b3d07a9f90818/ccb1bfab5bdf49bca506f753a253fb16/?activate_block_id=block-v1%%2BCINI01%2B2021T1%2Btype%40sequential%2Bblock%40ccb1bfab5bdf49bca506f753a253fb16" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:86.0) Gecko/20100101 Firefox/86.0" "myip"
lms_1 | num = len(clone)
lms_1 | File "/openedx/venv/lib/python3.8/site-packages/django/db/models/query.py", line 256, in len
lms_1 | self._fetch_all()
lms_1 | File "/openedx/venv/lib/python3.8/site-packages/django/db/models/query.py", line 1242, in _fetch_all
lms_1 | self._result_cache = list(self._iterable_class(self))
lms_1 | File "/openedx/venv/lib/python3.8/site-packages/django/db/models/query.py", line 55, in iter
lms_1 | results = compiler.execute_sql(chunked_fetch=self.chunked_fetch, chunk_size=self.chunk_size)
lms_1 | File "/openedx/venv/lib/python3.8/site-packages/django/db/models/sql/compiler.py", line 1142, in execute_sql
lms_1 | cursor.execute(sql, params)
lms_1 | File "/openedx/venv/lib/python3.8/site-packages/django/db/backends/utils.py", line 67, in execute
lms_1 | return self._execute_with_wrappers(sql, params, many=False, executor=self._execute)
lms_1 | File "/openedx/venv/lib/python3.8/site-packages/django/db/backends/utils.py", line 76, in _execute_with_wrappers
lms_1 | return executor(sql, params, many, context)
lms_1 | File "/openedx/venv/lib/python3.8/site-packages/django/db/backends/utils.py", line 84, in _execute
lms_1 | return self.cursor.execute(sql, params)
lms_1 | File "/openedx/venv/lib/python3.8/site-packages/django/db/utils.py", line 89, in exit
lms_1 | raise dj_exc_value.with_traceback(traceback) from exc_value
lms_1 | File "/openedx/venv/lib/python3.8/site-packages/django/db/backends/utils.py", line 84, in _execute
lms_1 | return self.cursor.execute(sql, params)
lms_1 | File "/openedx/venv/lib/python3.8/site-packages/django/db/backends/mysql/base.py", line 71, in execute
lms_1 | return self.cursor.execute(query, args)
lms_1 | File "/openedx/venv/lib/python3.8/site-packages/MySQLdb/cursors.py", line 206, in execute
lms_1 | res = self._query(query)
lms_1 | File "/openedx/venv/lib/python3.8/site-packages/MySQLdb/cursors.py", line 319, in _query
lms_1 | db.query(q)
lms_1 | File "/openedx/venv/lib/python3.8/site-packages/MySQLdb/connections.py", line 259, in query
lms_1 | _mysql.connection.query(self, query)
lms_1 | django.db.utils.OperationalError: (1054, "Unknown column 'lti_consumer_lticonfiguration.config_id' in 'field list'")

Any idea why this may happen? Thanks a lot in advance for your help.

XBlock rendering breaks when using a non-existent external configuration

Steps to reproduce

  1. Go to Studio and create a new LTI Consumer XBlock.
  2. Set the Configuration Type to Reusable Configuration.
  3. Provide a random LTI Reusable Configuration ID.

What happens

The XBlock breaks with Error: (1048, "Column 'version' cannot be null"). The relevant part of the traceback is:

File "/edx/shared-src/xblock-lti-consumer/lti_consumer/lti_xblock.py", line 1177, in author_view
  return self.student_view(context)
File "/edx/shared-src/xblock-lti-consumer/lti_consumer/lti_xblock.py", line 1227, in student_view
  context.update(self._get_context_for_template())
File "/edx/shared-src/xblock-lti-consumer/lti_consumer/lti_xblock.py", line 1730, in _get_context_for_template
  lti_consumer = self._get_lti_consumer()
File "/edx/shared-src/xblock-lti-consumer/lti_consumer/lti_xblock.py", line 1128, in _get_lti_consumer
  return get_lti_consumer(config_id_for_block(self))
File "/edx/shared-src/xblock-lti-consumer/lti_consumer/api.py", line 96, in config_id_for_block
  config = _get_lti_config_for_block(block)
File "/edx/shared-src/xblock-lti-consumer/lti_consumer/api.py", line 76, in _get_lti_config_for_block
  lti_config = _get_or_create_local_lti_config(
File "/edx/shared-src/xblock-lti-consumer/lti_consumer/api.py", line 46, in _get_or_create_local_lti_config
  lti_config.save()
File "/edx/shared-src/xblock-lti-consumer/lti_consumer/models.py", line 313, in save
  super().save(*args, **kwargs)

What should happen?

The XBlock should display an error about an invalid external configuration ID.

Note: Ensure that students (i.e., users without staff permissions) cannot see this error in LMS.

Originally posted by @Agrendalath in #390 (comment)

Test xblock-lti-consumer on Python 3.11

This repository is a dependency of edx-platform and needs to be upgraded to Python 3.11 before
the Redwood release is cut (mid-April).

  • Requirements are compiled with Python 3.8
  • Tests are run on Python 3.8 and 3.11
  • (Optional) Tests are also run with 3.12 and passing or 3.12 issues are ticketed.
  • Classifiers in setup.py, setup.cfg, or pyproject.toml indicate Python 3.11 support
  • A new version is released to PyPI
  • A PR is merged to edx-platform to use the new version

Custom Parameters not working in LTI 1.3 DL Launch

Hi there,
I am trying to send the user email to my tool in an LTI 1.3 launch, but it looks like this hasn't been implemented on your side, and the custom parameters option does not work. Kindly resolve this, as it is a very basic requirement for tools.

Request user’s username

How can I enable the "Request user’s username" in OpenEdx?

Docs says:

To make this setting available, contact your edX partner manager.

XBlock isn't completable

LTI Consumer XBlock should be completable like other problem types; i.e., it should implement xblock.completable.CompletableXBlockMixin.
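The shape of that change, using a minimal stand-in for the real mixin so the sketch carries no dependency on the xblock package (the attribute and method names follow the actual xblock.completable API; the block class and its launch hook are hypothetical):

```python
class CompletableXBlockMixinStub:
    """Stand-in for xblock.completable.CompletableXBlockMixin."""
    has_custom_completion = True

    def emit_completion(self, completion_percent):
        # The real mixin publishes a completion event to the runtime.
        self._completion = completion_percent

class LtiConsumerBlockSketch(CompletableXBlockMixinStub):
    def on_launch(self):
        # Mark the block complete once the learner launches the tool.
        self.emit_completion(1.0)
```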

Previously supported custom parameters not passing validation

#392 introduced a breaking change that prevents previously allowed custom parameters from passing validation.

Example of custom parameter that now does not pass validation:

["next=something?repo=https://github.com/org/repo?folder=/home/user&branch=main"]

Changes to the custom parameter validation should be backwards compatible and account for cases like this, where the user passes in a single string.
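A backwards-compatible check would only require a non-empty key before the first "=" and leave the value unconstrained (a sketch under that assumption; the real validator lives in the XBlock):

```python
def is_valid_custom_parameter(param: str) -> bool:
    """Accept any "key=value" string, however messy the value is."""
    key, sep, _value = param.partition("=")
    return bool(key) and bool(sep)
```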

Missing LTI version required parameters

The IMS certification for LTI 1.1 failed because of two missing parameters in the LTI launch message:

  1. tool_consumer_info_version
  2. tool_consumer_info_product_family_code

While those values don't seem to affect the LTI launches, they are required and need to be in the launch to conform to the spec.
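The fix amounts to merging the two parameters into the launch payload (a sketch; the helper name and the default values are assumptions):

```python
def with_tool_consumer_info(launch_params: dict,
                            family_code: str = "openedx",
                            version: str = "1.0") -> dict:
    """Add the tool consumer info parameters the LTI 1.1 certification expects."""
    params = dict(launch_params)
    params["tool_consumer_info_product_family_code"] = family_code
    params["tool_consumer_info_version"] = version
    return params
```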

Refactor LTI consumer classes to enable wider use of LTI features

Context

The current implementation of LTI specifications uses a set of "LTI consumer" classes. These classes define the various LTI 1.3 features: basic LTI 1.3 functionality, LTI 1.3 Advantage Services, and LTI 1.3 Proctoring Services. These consumer classes follow an inheritance model.

  • LtiConsumer1p3 is the base class and implements a basic LTI 1.3 launch.
  • LtiAdvantageConsumer extends LtiConsumer1p3 and implements an LTI 1.3 Advantage Services launch.
    • Note that basic LTI 1.3 launches use LtiAdvantageConsumer, not LtiConsumer1p3.
  • LtiProctoringConsumer extends LtiAdvantageConsumer and implements an LTI 1.3 Proctoring Services launch.

Problem

This became an issue in #297.

Because of the inheritance model, it is not possible to have an LTI integration that can both support LTI 1.3 Advantage Services and LTI Proctoring Services, because they're implemented as separate classes. This limits the use of and expansion of this library.

  • For reusable LTI configurations (i.e. those with config_store = CONFIG_EXTERNAL or config_store = CONFIG_ON_DB), this means that a single LTI configuration cannot support both LTI features and that separate configs will need to be created to support the features.
  • For LTI configurations on an XBlock (i.e. those with config_store = CONFIG_ON_XBLOCK), this means that a single XBlock cannot support both LTI features. For the time being, that seems acceptable, but it is a limitation.

Solution

The recommendation is to refactor the LTI consumer classes to share code through composition instead of inheritance. Suggestions from Giovanni on #297 are quoted below.

@MichaelRoytman These lines mean that if lti_1p3_proctoring_enabled is true, then none of the advantage services will be available to that consumer, right? If we're working with a globally re-usable configuration then this adds a constraint: proctoring configs and other types of LTI configs must always be separate.

We don't have a use case for this right now, so this shouldn't block this ticket, but can you add a comment mentioning this new constraint with some context?

I have one idea in mind: instead of using inheritance in the LtiConsumer, we could use composition, and add the services incrementally to the main class only if they are enabled in the configuration.

I did a review pass and left some comments. Something that bothers me a bit is the added complexity that having two different consumer classes is added. Maybe we should rework this and work with composition, only setting up each service if it is defined in the LTI configuration?

I don't think this needs to be added here since it will require some rework, but worth keeping in mind and discussing on a GH issue.

Also, with these complications arising, I'd be in favor of switching from inheritance to composition in the LtiConsumer classes - only configuring the LTI services if they are in the LTI config and throwing errors if the clients try to launch invalid LTI setups.
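
A composition-based consumer along the lines suggested above might look roughly like this sketch. All class names, configuration keys, and the proctoring claim URL are illustrative assumptions, not the library's actual API:

```python
# Hypothetical sketch of a composition-based LTI consumer. Services are
# attached incrementally based on configuration, instead of choosing
# between incompatible subclasses.

class AgsService:
    """Contributes Assignment and Grade Services claims to a launch."""
    def get_claims(self):
        return {"https://purl.imsglobal.org/spec/lti-ags/claim/endpoint": {}}

class ProctoringService:
    """Contributes proctoring-specific claims to a launch (claim URL illustrative)."""
    def get_claims(self):
        return {"https://purl.imsglobal.org/spec/lti-ap/claim/acs": {}}

class LtiConsumer:
    def __init__(self, config):
        # A single consumer can now carry any combination of services.
        self.services = []
        if config.get("ags_enabled"):
            self.services.append(AgsService())
        if config.get("proctoring_enabled"):
            self.services.append(ProctoringService())

    def generate_launch_claims(self):
        # Merge the claims contributed by every enabled service.
        claims = {}
        for service in self.services:
            claims.update(service.get_claims())
        return claims

# One configuration can enable both features at once:
consumer = LtiConsumer({"ags_enabled": True, "proctoring_enabled": True})
claims = consumer.generate_launch_claims()
```

With this shape, launching an invalid setup (e.g. a proctoring launch against a config without the proctoring service) can simply raise an error instead of requiring a separate consumer class.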

Missing tool_consumer_instance_guid parameter

The LTI request is missing the tool_consumer_instance_guid param.
This param is not required by the spec, but it is recommended. It is useful when several instances share the same LTI key and secret (dev/production, several campuses...).

Spec:

This is a unique identifier for the TC. A common practice is to use the DNS of the organization or the DNS of the TC instance. If the organization has multiple TC instances, then the best practice is to prefix the domain name with a locally unique identifier for the TC instance. This parameter is recommended.
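
Following the spec's recommendation, adding the parameter would amount to including one more key in the launch request, derived from the consumer's DNS name. A minimal sketch (the helper function and the settings value are hypothetical):

```python
from urllib.parse import urlparse

def tool_consumer_instance_guid(lms_root_url):
    # The spec recommends using the DNS of the TC instance, e.g.
    # "lms.example.com", optionally prefixed with a locally unique
    # identifier when one organization runs multiple instances.
    return urlparse(lms_root_url).hostname

# Hypothetical excerpt of LTI 1.1 launch parameters:
launch_params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    # Recommended: lets a tool distinguish dev/production instances or
    # multiple campuses that share one key/secret.
    "tool_consumer_instance_guid": tool_consumer_instance_guid(
        "https://lms.example.com"
    ),
}
```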

Remove XBlock `location` dependencies from LTI 1.3 launches

In order to make LTI 1.3 re-usable, we need to remove any dependencies on location from the LTI configuration model and from the launch flow. We currently use location to get contextual information and determine user permissions before an LTI launch.

This is being discussed in a few different places, so I wanted to link them all together here:

We need to:

  • Remove any hard dependencies on location for LTI launches.
  • Add some way for whatever is launching the LTI tool to pass the current context.
  • Implement some form of permission checking for each context/launch type.

The outcomes of this issue

  • An ADR should be written to cover technical details about permission checking in different contexts and how the context will be passed to LtiConfiguration.
  • LTI Consumer should be able to launch LTI content in the given contexts: XBlock, re-usable XBlock configuration, and directly through the Django APIs.
  • ???

Use cases

  • Do content launches through XBlocks
  • Re-using configurations and enabling LTI 1.3 launches from multiple blocks (and keeping context/grades separate)
  • Do content launches outside a block's context (e.g.: LTI course tabs, LTI 1.3 embedded discussions)
  • Do content launches on a separate IDA (e.g.: proctoring)

Notes:
I don't think we need to get everything working and fixed in one PR; this can be divided and worked on in smaller steps.


@tecoholic @Zacharis278 @MichaelRoytman @schenedx @ormsbee

get score from LTI provider

Hi, I have an issue. I have searched a lot about reading a score/grade from the LTI tool into Open edX, but unfortunately I did not find any answer. How do I get a score from the LTI provider? Is there some other configuration needed? Any help would be appreciated.

open-edx version: Hawthorn
I followed all these solutions:

student.html has no translations; needs expression_filter to be "h"

I just noticed that the student.html template needs an update. Here's the few problems I could spot in:

  • It doesn't specify <%page expression_filter="h"/>, which is needed for security
  • It makes no use of gettext for translation, which is weird since it does have a ## Translators: ... note
  • It's in Mako and could be converted to Django templates

The lines below are related to the issue:

https://github.com/edx/xblock-lti-consumer/blob/2fd330f8a62bc2fcc4951d546db5ca4920a5c1ec/lti_consumer/templates/html/student.html#L1-L4
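
A minimal sketch of the suggested header fix, assuming the template stays in Mako for now (the heading markup below is illustrative, not the template's actual content):

```mako
<%page expression_filter="h"/>
<%!
from django.utils.translation import gettext as _
%>
## Translators: heading shown to the learner above the LTI content
<h2>${_("LTI Unit")}</h2>
```

The `expression_filter="h"` directive makes every `${...}` expression HTML-escaped by default, and wrapping user-facing strings in `_()` makes them extractable for translation.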

extend support for readResult

Hi!

I'm not sure if this is the right place to post issues, or if this code is in production for edX. Nonetheless, I'd like to request support for readResult LTI requests. Currently only replaceResult is supported, which is a major bummer for LTI applications that wish to read grade results.

Could this be supported sometime in the future?
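
For context, the LTI 1.1 Basic Outcomes service defines readResult alongside replaceResult, using the same POX envelope. Dispatching on the operation in the outcome service handler could look roughly like the sketch below; the function names and the get_score callback are hypothetical, not the library's actual handler:

```python
import xml.etree.ElementTree as ET

# Namespace used by LTI 1.1 Basic Outcomes POX messages.
NS = {"ims": "http://www.imsglobal.org/services/ltiv1p1/xsd/imsoms_v1p0"}

def handle_outcome_request(request_body, get_score):
    """Dispatch a Basic Outcomes POX request.

    get_score(sourced_id) -> float is a hypothetical callback into the
    platform's grade storage.
    """
    root = ET.fromstring(request_body)
    body = root.find("ims:imsx_POXBody", NS)
    operation_el = body[0]
    operation = operation_el.tag.split("}")[-1]  # e.g. "readResultRequest"
    sourced_id = operation_el.find(".//ims:sourcedId", NS).text

    if operation == "readResultRequest":
        # New: return the stored score instead of rejecting the request.
        return {"operation": "readResult", "score": get_score(sourced_id)}
    if operation == "replaceResultRequest":
        score = float(operation_el.find(".//ims:textString", NS).text)
        return {"operation": "replaceResult", "score": score}
    raise ValueError("Unsupported operation: " + operation)

# Example readResult request body (structure per the Basic Outcomes spec):
sample = """<?xml version="1.0"?>
<imsx_POXEnvelopeRequest xmlns="http://www.imsglobal.org/services/ltiv1p1/xsd/imsoms_v1p0">
  <imsx_POXHeader/>
  <imsx_POXBody>
    <readResultRequest>
      <resultRecord>
        <sourcedGUID><sourcedId>abc-123</sourcedId></sourcedGUID>
      </resultRecord>
    </readResultRequest>
  </imsx_POXBody>
</imsx_POXEnvelopeRequest>"""
result = handle_outcome_request(sample, lambda sid: 0.8)
```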

Deep linking target_link_uri and redirect_uri not respected

  • During a deep link launch, the target_link_uri is set to the lti_1p3_launch_url here as opposed to the lti_advantage_deep_linking_launch_url.
    • I believe we want to be using the lti_advantage_deep_linking_launch_url as the target_link_uri during deep link launch since the normal launch url would be associated with a specific resource link (whereas during deep linking there may not even be a resource link created yet).
  • The redirect_uri is then set to the deep linking launch url instead of respecting (and validating) the redirect_uri returned by the Tool.
    • I believe we should respect the Tool-provided redirect_uri (assuming it's in our list of validated redirect URIs).

redirect_uri should be checked against pre-registered list

Per the LTI security framework, the redirect_uri should be validated as a registered redirect URI for the client.

redirect_uri
REQUIRED as per [OPENID-CCORE]. One of the registered redirect URIs.
...
Once the platform has validated the redirect_uri as a valid end point for the client_id, ...

This is also outlined in the OIDC Specification as:

REQUIRED. Redirection URI to which the response will be sent. This URI MUST exactly match one of the Redirection URI values for the Client pre-registered at the OpenID Provider, with the matching performed as described in Section 6.2.1 of [RFC3986] (Simple String Comparison) ...

Currently any redirect uri presented during launch would be honored without verification and the user could be redirected wherever the tool decides (a potential issue if someone decided to be a bad actor as it could be changed on the Tool side without anyone on the edx side being aware).
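
The check itself is small once a list of registered redirect URIs is stored with the tool's configuration. Per OIDC it must be an exact string comparison (no prefix or pattern matching); a sketch, with hypothetical names:

```python
def validate_redirect_uri(presented_uri, registered_uris):
    """Reject any redirect_uri that was not pre-registered for the client.

    OIDC requires Simple String Comparison (RFC 3986 section 6.2.1):
    the presented value must exactly match a registered value.
    """
    if presented_uri not in registered_uris:
        raise ValueError("redirect_uri is not registered for this client")
    return presented_uri

registered = ["https://tool.example.com/launch"]
validate_redirect_uri("https://tool.example.com/launch", registered)  # passes
```

The platform would call this during the launch leg, before redirecting the browser, so a tool-side change of the redirect target fails loudly instead of silently redirecting the user.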

If key has extra colons(:) in it.

Problem Statement:

We had a situation where we faced an issue with parsing the LTI passports because of extra colons in them.

LTI passports are parsed using the colon (:) as a delimiter. We were trying to integrate LTI content from the Rustici controller, whose provider key format is content:<unit-no>:<id>, so the LTI passport looks like

<content_id>:content:<unit-no>:<provider_id>:<secret_key>

This results in an unformatted key exception being thrown.

PR for the FIX: #70
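
The fix amounts to parsing the passport from both ends instead of splitting on every colon, so the key itself may contain colons. A sketch of the approach (see #70 for the actual change):

```python
def parse_lti_passport(passport):
    """Parse '<id>:<key>:<secret>' where <key> may contain colons."""
    try:
        # Take the id off the front and the secret off the back;
        # everything in between is the key, colons and all.
        lti_id, remainder = passport.split(":", 1)
        key, secret = remainder.rsplit(":", 1)
    except ValueError as err:
        raise ValueError(f"Could not parse LTI passport: {passport!r}") from err
    return lti_id, key, secret

# A Rustici-style key with embedded colons now parses correctly:
parse_lti_passport("my_id:content:42:provider:my_secret")
```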

LTI 1.3 Deep Linking Launch URL missing Custom Parameters

Custom Parameters were added to the repository as part of its 9.6.2 release via #392, but this seems to have accidentally omitted the required changes from the Deep Linking config URL.

Line 129 of lti_consumer.api

if deep_linking_enabled:
    launch_data.message_type = "LtiDeepLinkingRequest"
    deep_linking_launch_url = lti_consumer.prepare_preflight_url(
        launch_data,
    )

calls lti_consumer.lti_1p3.consumer.prepare_preflight_url

    def prepare_preflight_url(
            self,
            launch_data,
    ):
        """
        Generates OIDC url with parameters
        """
        user_id = launch_data.external_user_id if launch_data.external_user_id else launch_data.user_id

        # Set the launch_data in the cache. An LTI 1.3 launch involves two "legs" - the third party initiated
        # login request (the preflight request) and the actual launch -, and this information must be shared between
        # the two requests. A simple example is the intended LTI launch message of the LTI launch. This value is
        # known at the time that preflight request is made, but it is not accessible when the tool responds to the
        # preflight request and the platform must craft a launch request. This library stores the launch_data in the
        # cache and includes the cache key as the lti_message_hint query or form parameter to retrieve it later.
        launch_data_key = cache_lti_1p3_launch_data(launch_data)

        oidc_url = self.oidc_url + "?"

        login_hint = user_id

        parameters = {
            "iss": self.iss,
            "client_id": self.client_id,
            "lti_deployment_id": self.deployment_id,
            "target_link_uri": self.launch_url,
            "login_hint": login_hint,
            "lti_message_hint": launch_data_key,
        }

        return oidc_url + urlencode(parameters)

and we can see the custom params are not included in that parameters dict appended to the OIDC launch URL, which is what Deep Linking generates as part of the first-time config/setup portion of the connection.

Talking with Michael a little bit, it looks like this boils down to this claim in the LTI 1.3 consumer: it is being set explicitly with include_extra_claims=False here.
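
To illustrate the direction the report points at: rather than hard-coding the extra claims off, the configured custom parameters would be folded into the deep linking launch claims under the standard LTI custom claim. This is a rough, hypothetical sketch, not the library's actual claim-building code:

```python
# Standard LTI 1.3 claim for custom parameters.
CUSTOM_CLAIM = "https://purl.imsglobal.org/spec/lti/claim/custom"

def build_launch_claims(base_claims, custom_parameters, include_custom=True):
    """Merge configured custom parameters into the launch claims.

    include_custom mirrors the include_extra_claims flag mentioned above;
    the point of the fix is that deep linking launches would pass True.
    """
    claims = dict(base_claims)
    if include_custom and custom_parameters:
        claims[CUSTOM_CLAIM] = dict(custom_parameters)
    return claims

claims = build_launch_claims(
    {"https://purl.imsglobal.org/spec/lti/claim/message_type": "LtiDeepLinkingRequest"},
    {"unit": "demo"},
)
```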

DB configuration gets changed after loading an XBlock in Studio

Steps to reproduce

  1. Go to Studio and create a new LTI Consumer XBlock.
  2. Set the Configuration Type to CONFIG_ON_DB.
  3. Go to the DB configuration of this XBlock and set the configuration to CONFIG_EXTERNAL.
  4. Load the XBlock in Studio.

What happens

The CONFIG_EXTERNAL in the LtiConfiguration DB entry is changed to CONFIG_ON_DB.

What should happen?

The LtiConfiguration DB entry should remain intact.

Originally posted by @Agrendalath in #390 (review)

resource_link_title parameter missing

The resource_link_title parameter is missing from the LTI request.

Spec:

resource_link_title=My Weekly Wiki
A title for the resource. This is the clickable text that appears in the link. This parameter is recommended.

Challenges in Validating LtiConfiguration Data

Context:

In #260, support for database configuration for LTI 1.3 launches was added; these changes were put behind a CourseWaffleFlag. The CourseWaffleFlag is referenced in the LtiConsumerXBlock and in the LtiConfiguration classes. In the LtiConfiguration's clean method, the waffle flag is used to validate or clean incoming data to prevent the creation of an LtiConfiguration with a config_store value of CONFIG_ON_DB if the waffle flag is not enabled.

The problem is that the CourseWaffleFlag.is_enabled(<course_key>) method can only be called with a course_key. For LtiConfigurations with a None value for the location field, this raises an exception, because the location is used to load the XBlock from the modulestore, and the course_key is then retrieved from the XBlock. Without a location, there is no course_key, and CourseWaffleFlag.is_enabled(<course_key>) fails. This is only observed in the Django admin, because the admin calls Model.full_clean() explicitly, whereas Model.save() does not call Model.full_clean() or Model.clean().


I tried to use a WaffleFlag instead, thinking that I would sacrifice the added customizability of the CourseWaffleFlag in exchange for the ability of the flag to work outside the XBlock context. However, WaffleFlag requires there to be an available request object, and there isn't one when executing a model's clean method.

Problem:

Regardless, I believe that this kind of validation does not belong in the model clean method. This kind of business logic validation should occur before the database layer (e.g. in a Form class or in a DRF serializer), but we are not making use of Form classes or DRF serializers here. I would prefer that this check live in a centralized place within the library; however, I cannot easily determine where that should be because of the various ways LtiConfigurations can be created. The two questions I would like to answer are:

  1. How can we use feature flags throughout the codebase in a way that does not assume an XBlock runtime context?

  2. How can we ensure that the same kind of validation is applied in the three different ways LtiConfigurations can be created?

Requirements:

We want to ensure we do not save invalid data to LtiConfiguration in three places:

1. in the XBlock edit menu in Studio
Currently, this is handled by the xblock_studio_view.js JavaScript. There is no backend enforcement.
We could add this waffle flag to validate_field_data.

2. in the Django admin
Currently, whatever is in the LtiConfiguration.clean() method will be run.

3. in any other places where we create LtiConfiguration instances, like the Python API
Currently, there is no validation of the data used to create LtiConfiguration instances when the Python API is used. This is because calling Model.save() does not call Model.full_clean() or Model.clean().

It may happen that we decide to release "CONFIG_ON_DB" platform wide and remove the feature flag before edx-exams even needs it, in which case the only remaining issue is #2 under "Problems".

Other Notes:

  1. In the Python API, there is no clear place to validate the incoming data. Because of the way that data is synced from the XBlock to the LtiConfiguration in _get_lti_config and _get_or_create_local_lti_config, any time a caller intends to fetch an LtiConfiguration, it may be updated as part of the fetch to sync it with the modulestore. It's unexpected for a function like get_lti_1p3_launch_info to have to handle a ValidationError if we were to add validation to _get_or_create_local_lti_config.

  2. Currently, LtiConfiguration instances are created via the _get_or_create_local_lti_config API method, which is used throughout the codebase. Note that Model.full_clean() and Model.clean() are not actually called when Model.save() is called, so the model clean method only runs when making changes through the Django admin. With support for CONFIG_ON_DB, we need to support creating and editing LtiConfigurations from the Django admin while also being able to validate incoming data. It's possible to customize the model admin form to store the request and use a WaffleFlag, but it's hacky.
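
One way to address question 2 would be a single validation helper that every creation path calls explicitly (the admin form's clean, the Python API, and the XBlock's validate_field_data), instead of relying on Model.clean() being invoked. A sketch with hypothetical names; how db_config_enabled is resolved from the feature flag is left to each caller, since only the caller knows whether a course_key or request is available:

```python
class LtiConfigurationValidationError(Exception):
    """Raised when LtiConfiguration data fails business-logic validation."""

def validate_lti_config_data(config_store, db_config_enabled):
    """Shared check for all three creation paths.

    db_config_enabled is the resolved feature-flag value; resolving it in
    the caller avoids assuming an XBlock runtime (or request) context here.
    """
    if config_store == "CONFIG_ON_DB" and not db_config_enabled:
        raise LtiConfigurationValidationError(
            "CONFIG_ON_DB is not enabled for this context"
        )

validate_lti_config_data("CONFIG_ON_XBLOCK", db_config_enabled=False)  # passes
```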

`user_id` not set and sent as expected

Hello 👋,

I am running the Tutor distribution of Open edX (https://docs.tutor.overhang.io/) and trying to add an LTI component.

I followed these steps: https://edx.readthedocs.io/projects/edx-partner-course-staff/en/latest/exercises_tools/lti_component.html#enabling-and-using-lti-advantage-features

For the LTI component I want to pass the edx username to the LTI component. So here https://my-edx.de/admin/xblock_config/courseeditltifieldsenabledflag/ I added the respective course, which gives me the opportunity to set the flag "Request user's username" to True in my course.

Following your guide to test LTI (https://github.com/edx/xblock-lti-consumer#lti-11) I always get user_id=student and not the current edx user. Am I still missing something here? What do I have to configure to get the current user passed on to my LTI component?

Thank you!

Using Assignment and Grades throws an error on Dark language middleware

Release: Maple
Relevant settings: ENABLE_COMPREHENSIVE_THEMING=True
LTI consumer settings:

  • LTI version: 1.3
  • Enable LTI NRPS: True
  • Deep linking: True
  • LTI Assignment and Grades Service: Programmatic
  • Everything else: True

We have configured an LTI 1.3 producer using PyLTI1p3 library (using the Django example).

The connection works fine; however, when the tool uses the Assignment and Grades service or the Names and Roles service, every request it sends to our Maple instance triggers a 500 due to a missing user at https://github.com/edx/edx-platform/blob/4f49475b94a61c3290c1813f5272277b7986b06c/openedx/core/djangoapps/dark_lang/middleware.py#L159

The error is:

AttributeError
'NoneType' object has no attribute 'is_authenticated'
openedx/core/djangoapps/dark_lang/middleware.py in _activate_preview_language at line 165
openedx/core/djangoapps/dark_lang/middleware.py in process_response at line 102
django/utils/deprecation.py in __call__ at line 119
django/core/handlers/exception.py in inner at line 47

CI error on master - PyPI deployment fails

I think I've found out why the CI is failing on master: the PyPI deploy job runs twice (once at the end of each job - quality and tests), and one of the runs will always fail since the version is already on PyPI.

It doesn't break anything, but it makes the CI on master red and should be fixed (low priority).

CC @sarina @nedbat

Remove xblockutils package

Parent ticket: openedx/axim-engineering#915

This repository uses the xblock-utils package, which has been deprecated and migrated into the xblock package.

In this ticket, we will:

Tasks
