microsoft / energy-data-services-experience-lab

Experience Lab is a set of utilities that assist in creating instances of Microsoft Azure Data Manager for Energy, loading data, and performing basic management operations.

License: MIT License

Languages: TypeScript 43.56%, Shell 23.92%, HTML 18.39%, Bicep 7.05%, CSS 4.68%, SCSS 1.20%, JavaScript 0.72%, Dockerfile 0.48%

energy-data-services-experience-lab's Introduction

Experience Lab - Microsoft Azure Data Manager for Energy

Build Status: CI

About

Experience Lab is an automated, end-to-end deployment accelerator for Microsoft Azure Data Manager for Energy. It gives customers and partners fast, easy deployments, complete with sample datasets, for learning, testing, demos, and training.

Experience Lab makes it easy to create a Developer Tier instance of Azure Data Manager for Energy. It includes a simple web UI for basic management tasks such as creating users and legal tags and loading standard sample data sets, and it integrates with Power BI Desktop for validating and visualizing the loaded data.

Even users who are not deeply technical can therefore create fully configured, data-loaded instances of Azure Data Manager for Energy quickly and easily.

Experience Lab is recommended for non-production use cases only. It is open source so that our customers and partners can freely use and extend it for their bespoke use cases, including automating the deployment of their own applications with Azure Data Manager for Energy.

Components

Installing and running Experience Lab

Requirements

  • Experience Lab must be deployed in a region currently supported by Azure Data Manager for Energy.
  • The default installation script for the Experience Lab control plane requires the following privileges:
    • Owner, Service Administrator, Co-Administrator, or Contributor + User Access Administrator at the subscription level (a quick way to check is sketched after this list).
    • Permission to register an application with your Azure AD tenant.
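
Before running the installer, it may help to confirm which roles you hold at the subscription scope. A minimal sketch using the Azure CLI, assuming you are already signed in:

# Sketch: list your role assignments at the subscription scope.
SUB_ID=$(az account show --query id -o tsv)
ME=$(az ad signed-in-user show --query id -o tsv)
az role assignment list \
  --assignee "$ME" \
  --scope "/subscriptions/$SUB_ID" \
  --query "[].roleDefinitionName" -o tsv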

Create Control Plane Via ARM Template

Use the button below to deploy the Experience Lab Control Plane to your Azure subscription. Further instructions are in /control-plane.

Deploy to Azure
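
If you prefer the CLI to the portal button, the same deployment can be scripted. A minimal sketch, assuming a local clone of this repo; the template file name below is an assumption, so check /control-plane for the actual entry point:

# Sketch: deploy the control plane with the Azure CLI.
# control-plane/azuredeploy.json is an assumed file name.
az group create --name experience-lab-rg --location eastus
az deployment group create \
  --resource-group experience-lab-rg \
  --template-file control-plane/azuredeploy.json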

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.

energy-data-services-experience-lab's People

Contributors

danieljamescarpenter, danielscholl, dependabot[bot], microsoft-github-operations[bot], microsoftopensource, miller-kyle, nur858


energy-data-services-experience-lab's Issues

Need for fully parameterizing experiencelab.sh

experiencelab.sh:

I have converted the CreateLab deployment to Bicep for automation, naming and standardization purposes, but noticed limitations.

Could you please replace or remove the local parameter ITEM_NAME and parameterize all the resource names, so that a naming convention can be applied by choice? The resources that aren't parameterized and use ITEM_NAME are: Key Vault, Storage Account, Container Registry, Managed Identity, Log Analytics, template, App Service Plan, App Service, the MEDS instance (Platform), and FIRST_LEGAL_TAG_NAME. The DATA_PARTITION parameter should also be an array.

There is also a need for a parameter for the data platform resource group, since the data platform is most likely deployed into an existing, separate resource group.
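
For illustration, a minimal sketch of how experiencelab.sh could accept caller-supplied names while keeping ITEM_NAME-derived defaults; every variable name below is hypothetical, not an actual parameter of the script:

# Sketch: allow overrides, fall back to ITEM_NAME-derived names.
ITEM_NAME=${ITEM_NAME:-explab$RANDOM}
KEY_VAULT_NAME=${KEY_VAULT_NAME:-kv-${ITEM_NAME}}
STORAGE_ACCOUNT_NAME=${STORAGE_ACCOUNT_NAME:-sa${ITEM_NAME}}
APP_SERVICE_NAME=${APP_SERVICE_NAME:-app-${ITEM_NAME}}

# Sketch: DATA_PARTITIONS as an array, as requested above.
DATA_PARTITIONS=(${DATA_PARTITIONS:-opendes})
for p in "${DATA_PARTITIONS[@]}"; do
  echo "configuring partition: $p"
done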

Missing bulk data?

We’ve had a chance to try the latest MEDS release and are seeing a similar problem with missing bulk DDMS data to the one we saw with the Oak Forest preview release.

We also loaded the TNO data using our site, https://experiencelab7284.azurewebsites.net/dataload, by deploying the “open-test-data” spec in Azure.

We see lots of wells, wellbores, and well logs loaded. When we make a request to DDMS to get the well log header, that works fine.

GET …../api/os-wellbore-ddms/ddms/v3/welllogs/'logId'

When we ask for the bulk data

GET …../api/os-wellbore-ddms/ddms/v3/welllogs/'logId'/data

We get an error.

{
'detail': 'bulk for record platform7284-opendes:work-product-component--WellLog:000a9129ac8b4880af1c42a097b458bb not found'
}

As an attempt at fixing this, we tried populating the bulk data ourselves from the LAS files used to create the well log headers. We tried sending both a parquet data file and a JSON Pandas file to the POST bulk data API. The first errors we got back looked like this:

{'detail':'Column(s) DT, NPHI, RHOB, GR, DEPT do(es) not match any CurveID of the WellLog record.'}

The well log headers don’t have any CurveID values, so it looks like that prevents the bulk data API from working.

I forced an update of the CurveID values (setting them equal to the mnemonic) on a couple of WellLog records to see if that helps, then tried uploading the bulk data again. It now takes about 80 seconds for the request to come back, but I still get a 500 error back.

{'detail':'Unexpected error and save bulk'}

I tried both the parquet and the JSON Pandas formats and get the same errors.

Is the intention that the TNO data load should have loaded the bulk data? If not, do you know why our attempts to populate it ourselves may have failed?

Thanks,
Mike
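
For anyone reproducing this report, the round trip being attempted looks roughly like the sketch below. The application/x-parquet content type and the BASE_URL and TOKEN variables are assumptions based on the OSDU Wellbore DDMS API, not verified against this deployment:

# Sketch: write and read WellLog bulk data via the Wellbore DDMS.
LOG_ID="platform7284-opendes:work-product-component--WellLog:000a9129ac8b4880af1c42a097b458bb"

# Write bulk data from a local parquet file (content type assumed).
curl -X POST "$BASE_URL/api/os-wellbore-ddms/ddms/v3/welllogs/$LOG_ID/data" \
  -H "Authorization: Bearer $TOKEN" \
  -H "data-partition-id: opendes" \
  -H "Content-Type: application/x-parquet" \
  --data-binary @welllog.parquet

# Read it back; this is the call that returned "bulk ... not found".
curl "$BASE_URL/api/os-wellbore-ddms/ddms/v3/welllogs/$LOG_ID/data" \
  -H "Authorization: Bearer $TOKEN" \
  -H "data-partition-id: opendes"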

Unable to load data into ADME instance using the TNO data load container instance created as part of the Experience Lab Bicep template

Hi Team,

Thank you for your contribution in providing the Bicep and ARM scripts to create the Experience Lab, the ADME instance, and other resources. When we run the TNO data load ACI to load data into the ADME instance it does not work and we see errors, but when I create the TNO data load using the other repo, https://github.com/Azure/osdu-data-load-tno, and run it, it works. Please let us know which one we should use to load the data into ADME. Thank you.

Regards,
Sreedhar

Deployments not working

Deployments of Experience Lab are currently broken, failing with the following error while running the deploy container (generated from the CreateLab template).

=================================================================================================================
2023-03-01T21:02Z Creating Data Platform
=================================================================================================================

  Be Patient ~1 hour wait...
ERROR: the following arguments are required: --properties/-p

Examples from AI knowledge base:
az resource create --resource-group myRG --name myWeb --resource-type Microsoft.web/sites --properties "{ \"serverFarmId\":\"/subscriptions/{SubID}/resourcegroups/ {ResourceGroup}/providers/Microsoft.Web/serverfarms/{ServicePlan}\" }"
Create a web app with the minimum required configuration information.

az group create --location westus --resource-group MyResourceGroup
Create a new resource group in the West US region.

az group create --location westeurope --resource-group MyResourceGroup --tags {tags}
Create a new resource group. (autogenerated)

https://docs.microsoft.com/en-US/cli/azure/group#az_group_create
Read more about the command in reference docs
main: line 568: --properties: command not found

I have tested changing the references from Microsoft.OpenEnergyPlatform to Microsoft.MicrosoftEnergyDataServices within the experiencelab.sh file (on the Storage Account) before running the CreateLab (deploy container instance), but it did not change the error above.
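
The "main: line 568: --properties: command not found" message reads like a broken line continuation in the script: the shell starts a new command at --properties instead of passing it to az resource create. For reference, a correctly quoted invocation looks roughly like the sketch below; the resource type and the property payload are assumptions, not the script's exact values:

# Sketch: an unbroken az resource create call; every continuation
# backslash must be the last character on its line.
az resource create \
  --resource-group "$RESOURCE_GROUP" \
  --location "$LOCATION" \
  --name "$PLATFORM_NAME" \
  --resource-type "Microsoft.OpenEnergyPlatform/energyServices" \
  --properties "{ \"authAppId\": \"$CLIENT_ID\", \"dataPartitionNames\": [{ \"name\": \"opendes\" }] }"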

Hi @Miller-Kyle, thank you for the update. I see one more issue: while I am running the template spec to upload the data to the ADME instance, I am getting an access denied error. Please help me mitigate this issue.

2023-07-05 06:06:20,558 [Dataload ] [INFO ] args: Namespace(dir='/app/open-test-data/datasets/documents', output='output/loaded-documents-datasets.json', subparser='datasets')
2023-07-05 06:06:20,614 [Dataload ] [INFO ] Amount of data to transfer: 26.22MB
2023-07-05 06:06:20,614 [Dataload ] [INFO ] Request worker count: 50
2023-07-05 06:06:20,614 [Dataload ] [INFO ] Max Chunk size: 32 MB
2023-07-05 06:06:20,654 [Dataload ] [INFO ] Total number of files to upload: 9
2023-07-05 06:06:20,870 [Dataload ] [ERROR ] /files/uploadURL failed for /app/open-test-data/datasets/documents/FP.pdf with response <Response [403]>
2023-07-05 06:06:20,884 [Dataload ] [ERROR ] /files/uploadURL failed for /app/open-test-data/datasets/documents/osduka_demo_texts.txt with response <Response [403]>
2023-07-05 06:06:20,977 [Dataload ] [ERROR ] /files/uploadURL failed for /app/open-test-data/datasets/documents/prov22.pdf with response <Response [403]>
2023-07-05 06:06:21,108 [Dataload ] [ERROR ] /files/uploadURL failed for /app/open-test-data/datasets/documents/FS12-3031.pdf with response <Response [403]>
2023-07-05 06:06:21,119 [Dataload ] [ERROR ] /files/uploadURL failed for /app/open-test-data/datasets/documents/FI.pdf with response <Response [403]>
2023-07-05 06:06:21,121 [Dataload ] [ERROR ] /files/uploadURL failed for /app/open-test-data/datasets/documents/69_D_CH_11.pdf with response <Response [403]>
2023-07-05 06:06:21,124 [Dataload ] [ERROR ] /files/uploadURL failed for /app/open-test-data/datasets/documents/prov33.pdf with response <Response [403]>
2023-07-05 06:06:21,149 [Dataload ] [ERROR ] /files/uploadURL failed for /app/open-test-data/datasets/documents/FS12-3003.pdf with response <Response [403]>
2023-07-05 06:06:21,539 [Dataload ] [ERROR ] /files/uploadURL failed for /app/open-test-data/datasets/documents/69_D_CH_13.pdf with response <Response [403]>
2023-07-05 06:06:21,674 [Dataload ] [INFO ] File location map is saved to output/loaded-documents-datasets.json
2023-07-05 06:06:21,690 [Dataload ] [INFO ] Files that could not be uploaded: 9
2023-07-05 06:06:21,690 [Dataload ] [INFO ] ['/app/open-test-data/datasets/documents/FP.pdf',
'/app/open-test-data/datasets/documents/osduka_demo_texts.txt',
'/app/open-test-data/datasets/documents/prov22.pdf',
'/app/open-test-data/datasets/documents/FS12-3031.pdf',
'/app/open-test-data/datasets/documents/FI.pdf',

Originally posted by @sreedharguda in #34 (comment)
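
A 403 from /files/uploadURL usually means the principal running the load lacks entitlements in the target partition. A minimal sketch for checking, and if needed granting, membership through the Entitlements API; the group domain and endpoint paths follow the OSDU Entitlements v2 convention and should be verified against your instance:

# Sketch: list the groups the calling principal belongs to.
curl "$BASE_URL/api/entitlements/v2/groups" \
  -H "Authorization: Bearer $TOKEN" \
  -H "data-partition-id: opendes"

# Sketch: add the principal to users@<partition> (domain assumed).
curl -X POST "$BASE_URL/api/entitlements/v2/groups/users@opendes.dataservices.energy/members" \
  -H "Authorization: Bearer $TOKEN" \
  -H "data-partition-id: opendes" \
  -H "Content-Type: application/json" \
  -d "{\"email\": \"$OBJECT_ID\", \"role\": \"MEMBER\"}"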


tnoDataLoad

Looks like there is a broken link to the dataset.

[screenshot attached]

Cannot create Experience Lab (tried Europe West and North)

I am using the Deploy to Azure button with the ARM template. I have had three tries that always end with the same error. I am using the SET subscription that is whitelisted.

Getting an error on the dataLoad deployment:

"code": "InvalidValuesForRequestParameters",
"message": "Values for request parameters are invalid: signedExpiry."

See also the attached image (dataload).

Jon Olav Abeland
[email protected]
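
The signedExpiry complaint usually means the SAS request made by the dataLoad deployment carries a malformed or already-past expiry timestamp. For reference, a well-formed account SAS request from the CLI uses an ISO-8601 UTC expiry; a minimal sketch, with the storage account name as a placeholder:

# Sketch: generate an account SAS with a valid signedExpiry
# (ISO-8601 UTC, in the future). Uses GNU date.
EXPIRY=$(date -u -d "+1 day" '+%Y-%m-%dT%H:%MZ')
az storage account generate-sas \
  --account-name "$STORAGE_ACCOUNT" \
  --services b \
  --resource-types sco \
  --permissions rwl \
  --expiry "$EXPIRY" \
  -o tsv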
