microsoft / gr-azure
License: GNU General Public License v3.0
testing the issue workflow to ADO
The Python blocks' docs look fine, but the C++ blocks (currently just the two DIFI blocks) don't have their docstrings showing up in GRC's documentation tab. The docs appear to be in the right spot in the header file and use the proper Doxygen tags, so this might be a CMake issue? Both the dev VM and my local install demonstrate this problem.
Fixes AB#8347
Some users may want to cache blob file contents if they intend to re-run the same processing multiple times, without re-downloading the same file over and over.
It may be useful to include some pointers in the documentation on how to handle these use cases: for example, using something like a Lustre file system with File Source blocks when users want to flexibly work with a large number of blobs, or using AzCopy to manually copy a smaller number of blobs and again reading them with File Source blocks.
https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10
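One way to sketch the caching idea in the documentation would be a download-on-miss helper. This is illustrative only: `cached_blob` and the `download` callable are hypothetical, not part of the OOT module.

```python
from pathlib import Path

def cached_blob(local_path, download):
    """Download a blob only if no local copy exists yet.

    `download` is a hypothetical callable returning the blob's bytes
    (e.g. a wrapper around an Azure SDK download call); it is only
    invoked on a cache miss, so repeated flowgraph runs reuse the
    local file instead of re-downloading it.
    """
    path = Path(local_path)
    if not path.exists():
        path.write_bytes(download())
    return path
```

A File Source block could then be pointed at the returned local path.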
Based on the docs, setting the stream number to -1 is supposed to make the block accept all stream numbers, and there is code in the impl to reflect that. However, you can't actually set it to -1, because the parameter is a uint; you'll get a TypeError instead.
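One possible fix, sketched below under the assumption that the parameter is changed to a signed int: normalize the documented sentinel before it reaches the matching logic. `normalize_stream_id` is a hypothetical helper, not part of the current code.

```python
def normalize_stream_id(stream_id):
    """Map the documented 'accept all streams' sentinel (-1) to None.

    Illustrative only: today the DIFI block's parameter is a uint, so
    passing -1 raises a TypeError before any logic like this can run.
    Accepting a signed int and normalizing it here is one way to make
    the behavior match the docs.
    """
    if stream_id == -1:
        return None  # match any stream number
    if stream_id < 0:
        raise ValueError("stream number must be -1 or non-negative")
    return stream_id
```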
Background:
During integration testing, errors came up from "integration_blob_common.py" concerning blob permissions:
azure.core.exceptions.HttpResponseError: This request is not authorized to perform this operation using this permission.
The current documentation recommends using the az cli command "az login" to prevent this issue, but after multiple tries it was discovered that the test was actually using the Managed Identity, which didn't have any permissions on the Storage Account. From reviewing the documentation, this behavior is consistent with the credential order defined here.
Workaround:
As part of our test, we found that granting the VM's managed identity access to the Storage Account and the Key Vault allowed the integration tests to pass and finish.
Next steps:
The development group needs to issue a recommendation on credentials. Most of the content mentions the az cli, but given that we have also asked users to assign a managed identity during VM creation (picture below), the auth order will not allow the az cli credentials to take effect.
Alternatively, we could add comments reminding the user to grant access to the Managed Identity, and delete the references to the az cli.
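The credential-order behavior described above can be modeled with a small sketch. This is illustrative only: the credential names are simplified labels, and the order follows the documented DefaultAzureCredential chain, where managed identity is probed before the Azure CLI.

```python
# Simplified model of DefaultAzureCredential's documented probe order:
# environment variables, then managed identity, then the Azure CLI.
CHAIN = ["environment", "managed_identity", "azure_cli"]

def selected_credential(available):
    """Return the first credential type in the chain that is available."""
    for cred in CHAIN:
        if cred in available:
            return cred
    return None

# A VM created with a managed identity wins even after `az login`,
# which is why the az cli credentials never took effect in our tests.
```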
Can you please tell me where these blocks are, the ADS-B Framer and the other ADS-B blocks?
https://github.com/microsoft/gr-azure/blob/main/docs/images/example_flowgraph.png
In addition, this ARM template deployment link does not work; there appears to be an issue with CORS.
https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2Fmicrosoft%2Fazure-software-radio%2Fdev%2Fgr-azure-software-radio%2Fexamples%2Fblob_example_resources.json
Thank you!
I cloned the repository on the Azure Development VM and tried to build the OOT modules, but I'm getting this error after running cmake ..:
-- The CXX compiler identification is GNU 9.3.0
-- The C compiler identification is GNU 9.3.0
-- Check for working CXX compiler: /bin/c++
-- Check for working CXX compiler: /bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Check for working C compiler: /bin/cc
-- Check for working C compiler: /bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Build type not specified: defaulting to release.
-- Found PkgConfig: /bin/pkg-config (found version "0.29.1")
-- Found LOG4CPP: /usr/lib/x86_64-linux-gnu/liblog4cpp.so
-- Checking for module 'gmp'
-- Found gmp, version 6.2.0
-- Found GMP: /usr/lib/x86_64-linux-gnu/libgmpxx.so
-- Using GMP.
-- Found MPLIB: /usr/lib/x86_64-linux-gnu/libgmpxx.so
-- Found Boost: /lib/x86_64-linux-gnu/cmake/Boost-1.71.0/BoostConfig.cmake (found suitable version "1.71.0", minimum required is "1.71.0") found components: date_time program_options filesystem system regex thread unit_test_framework
-- Found Volk: Volk::volk
-- User set python executable /usr/bin/python3
-- Found PythonInterp: /usr/bin/python3 (found version "3.8.10")
-- Found PythonLibs: /usr/lib/x86_64-linux-gnu/libpython3.8.so (found suitable exact version "3.8.10")
-- Found Git: /bin/git
-- Could NOT find Doxygen (missing: DOXYGEN_EXECUTABLE)
-- Found PythonLibs: /usr/lib/x86_64-linux-gnu/libpython3.8.so
-- Using install prefix: /usr/local
-- Building for version: v1.0-compat-xxx-xunknown / 1.0.0git
-- No C++ unit tests... skipping
-- Could NOT find Doxygen (missing: DOXYGEN_EXECUTABLE)
-- PYTHON and GRC components are enabled
-- Python checking for pygccxml - not found
-- Performing Test HAS_FLTO
-- Performing Test HAS_FLTO - Success
-- LTO enabled
-- Configuring done
CMake Error in lib/CMakeLists.txt:
Imported target "gnuradio::gnuradio-runtime" includes non-existent path
"/include"
in its INTERFACE_INCLUDE_DIRECTORIES. Possible reasons include:
* The path was deleted, renamed, or moved to another location.
* An install or uninstall procedure did not complete successfully.
* The installation package was faulty and references files it does not
provide.
CMake Error in lib/CMakeLists.txt:
Imported target "gnuradio::gnuradio-runtime" includes non-existent path
"/include"
in its INTERFACE_INCLUDE_DIRECTORIES. Possible reasons include:
* The path was deleted, renamed, or moved to another location.
* An install or uninstall procedure did not complete successfully.
* The installation package was faulty and references files it does not
provide.
CMake Error in python/bindings/CMakeLists.txt:
Imported target "Boost::date_time" includes non-existent path
"/include"
in its INTERFACE_INCLUDE_DIRECTORIES. Possible reasons include:
* The path was deleted, renamed, or moved to another location.
* An install or uninstall procedure did not complete successfully.
* The installation package was faulty and references files it does not
provide.
CMake Error in python/bindings/CMakeLists.txt:
Imported target "Boost::date_time" includes non-existent path
"/include"
in its INTERFACE_INCLUDE_DIRECTORIES. Possible reasons include:
* The path was deleted, renamed, or moved to another location.
* An install or uninstall procedure did not complete successfully.
* The installation package was faulty and references files it does not
provide.
-- Generating done
CMake Generate step failed. Build files cannot be regenerated correctly.
I would like to be able to build the OOT modules with the default VM image. Am I doing something wrong?
The test-signal.dat file is not being created by the Blob Sink block. I configured everything as you suggested, and I'm also authenticating with the Azure CLI.
I am following:
Quickstart: Key Vault with Role Based Access Controls and Azure CLI Credentials.
Quickstart: Running the Blob Source and Sink blocks with az login.
I am following the README in gr-azure-software-radio on a fresh Ubuntu 20.04 VM.
Here is the output of cmake ..:
-- The CXX compiler identification is GNU 9.3.0
-- The C compiler identification is GNU 9.3.0
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Build type not specified: defaulting to release.
-- Found PkgConfig: /usr/bin/pkg-config (found version "0.29.1")
-- Found LOG4CPP: /usr/lib/x86_64-linux-gnu/liblog4cpp.so
-- Checking for module 'gmp'
-- Found gmp, version 6.2.0
-- Found GMP: /usr/lib/x86_64-linux-gnu/libgmpxx.so
-- Using GMP.
-- Found MPLIB: /usr/lib/x86_64-linux-gnu/libgmpxx.so
-- Found Boost: /usr/lib/x86_64-linux-gnu/cmake/Boost-1.71.0/BoostConfig.cmake (found suitable version "1.71.0", minimum required is "1.71.0") found components: date_time program_options filesystem system regex thread unit_test_framework
-- Found VOLK: Volk::volk
-- User set python executable /usr/bin/python3
-- Found PythonInterp: /usr/bin/python3 (found version "3.8.10")
-- Found PythonLibs: /usr/lib/x86_64-linux-gnu/libpython3.8.so (found suitable exact version "3.8.10")
-- Found Git: /usr/bin/git
-- Could NOT find Doxygen (missing: DOXYGEN_EXECUTABLE)
-- Found PythonLibs: /usr/lib/x86_64-linux-gnu/libpython3.8.so
-- Using install prefix: /usr/local
-- Building for version: v1.0-compat-xxx-xunknown / 1.0.0git
-- No C++ unit tests... skipping
-- Could NOT find Doxygen (missing: DOXYGEN_EXECUTABLE)
-- PYTHON and GRC components are enabled
-- Python checking for pygccxml - not found
-- Performing Test HAS_FLTO
-- Performing Test HAS_FLTO - Success
-- LTO enabled
-- Configuring done
-- Generating done
-- Build files have been written to: /home/gustavo/code/azure-software-radio/gr-azure-software-radio/build
And here is the output of make:
Scanning dependencies of target gnuradio-azure_software_radio
[ 8%] Building CXX object lib/CMakeFiles/gnuradio-azure_software_radio.dir/difi_source_cpp_impl.cc.o
[ 16%] Building CXX object lib/CMakeFiles/gnuradio-azure_software_radio.dir/difi_sink_cpp_impl.cc.o
make[2]: *** No rule to make target '/usr/lib/x86_64-linux-gnu/liborc-0.4.so', needed by 'lib/libgnuradio-azure_software_radio.so.v1.0-compat-xxx-xunknown'. Stop.
make[1]: *** [CMakeFiles/Makefile2:251: lib/CMakeFiles/gnuradio-azure_software_radio.dir/all] Error 2
make: *** [Makefile:141: all] Error 2
Then I googled around a bit and found that this means I'm missing liborc-0.4.so. I was able to fix this with sudo apt install liborc-0.4-dev, after which I was able to install the modules and use them in gnuradio-companion. But is this the right thing to do? If so, I think the README should be updated to call out this dependency on liborc.
Depending on the environment used to run cmake when building the OOT modules on the Developer VM, Python files may be installed into different directories.
The build process should produce consistent results regardless of whether a user runs cmake from a direct SSH terminal on the Developer VM or from a terminal inside a remote desktop session.
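A quick way to see which install directory a given environment would produce is to query the interpreter that is on PATH in that session. The sketch below uses the stdlib's sysconfig; whether GNU Radio's CMake scripts query the interpreter in exactly this way is an assumption, but the point is the same: a different python3 on PATH yields a different install directory.

```python
import sysconfig

# Where this interpreter expects third-party packages to be installed.
# If the SSH session and the remote desktop session resolve python3 to
# different interpreters (or different environments), this path differs,
# and so does the directory the build installs into.
install_dir = sysconfig.get_path("platlib")
print(install_dir)
```

Running this in both the SSH terminal and the remote desktop terminal would confirm whether the two environments disagree.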
In the Blob Sink yaml here https://github.com/microsoft/azure-software-radio/blob/main/gr-azure-software-radio/grc/azure_software_radio_blob_sink.block.yml#L29 I'm not sure we are dealing with i-shorts correctly. It shows them using the int32 datatype, but i-shorts usually use int16, and you just have to keep track separately that they are interleaved IQ rather than plain reals. That's why, when you read in a file of i-shorts, you read it as real shorts and then use the IShort-to-Complex block. No one really uses int32s themselves (and I don't know of any blocks that use them), so we just need to make sure we support complex float32, float32, int16, complex int16, uint8, and maybe complex uint8; that will cover something like 99.99% of use cases. When you give the Blob Sink the data, there's no difference between complex and real for the int16s and uint8s (chars); you would save them to a binary file the same way. However, now that we have SigMF support, there is a reason to keep track, perhaps with an "is complex" block param, so that the datatype in the SigMF meta file is accurate.
I think we have to change it to use int16 for the "sc16" type, but then we also need a way to distinguish between sc16 and plain int16 beyond the dtype, which will now be the same for both; perhaps a boolean is_complex or something similar that the user can set when they have i-shorts being fed into the block, so that the SigMF side knows how to label it.
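For reference, the usual i-short handling can be sketched in plain Python. The sample values are made up, and struct stands in for reading a binary blob; this mirrors what the IShort-to-Complex block does conceptually.

```python
import struct

# Six interleaved signed 16-bit values: three complex "sc16" (i-short)
# samples, I then Q, little-endian, as they would sit in a binary file.
raw = struct.pack("<6h", 100, -200, 300, -400, 500, -600)

# Each complex sc16 sample is 4 bytes, the same size as one int32,
# which is why labeling the stream int32 "works" byte-wise in the yaml
# but mislabels the actual datatype.
shorts = struct.unpack("<%dh" % (len(raw) // 2), raw)
iq = [complex(i, q) for i, q in zip(shorts[0::2], shorts[1::2])]
```

With an `is_complex` flag (hypothetical param name), the sink could write `ci16_le` instead of `i16` into the SigMF metadata while the on-disk bytes stay identical.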
There are important files that all Microsoft projects should have that are not present in this repository. A pull request has been opened to add the missing file(s). When the PR is merged, this issue will be closed automatically.
Microsoft teams can learn more about this effort and share feedback within the open source guidance available internally.
As I've been using the blob blocks, mainly the source block, I have to manually figure out what maximum rates I can achieve in order to stream samples at 100% rate without delays mixed in. When I'm on a VM in Azure, I think the rate is limited by the bandwidth between the VM and blob storage, which becomes much more of a limitation if the storage account is in a different region than the VM; for some applications, that could break the flowgraph just from moving from West to East, for example. Standard vs. premium storage accounts may also impact it.
So one idea is a block with no inputs or outputs that you can drop into a flowgraph and run, which performs a simple test of how fast it can read from and write to blob storage, like UHD's benchmark_rate tool. It could also be a command-line utility that ships with the OOT module rather than a block, but if it's a block, then people who don't read any docs will still see it =). I don't know if Azure already has a built-in utility for this that we could call and print the results to the console, but even if the source side has to be done manually, it shouldn't be too hard: you point at an example IQ file in blob storage, read it into a flowgraph with no throttle into a null sink or something, and probe the rate.
We could even set up storage accounts in all the major regions and add a drop-down block param to choose the region for the speed test; that way users wouldn't need a storage account already set up to run the test, and there would be no block params to fiddle with. Just an idea that seems like it might be useful to folks who want to use blob storage for real applications.
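The manual benchmark described above could be sketched roughly like this. `measure_mbps` is a hypothetical helper, and io.BytesIO stands in for the real blob download stream; in the real tool, `read_chunk` would wrap the Azure SDK's blob download.

```python
import io
import time

def measure_mbps(read_chunk, total_bytes, chunk_size=4 * 1024 * 1024):
    """Pull total_bytes through read_chunk(n) and return average MB/s.

    read_chunk(n) must return up to n bytes per call; for a real test
    it would wrap a blob download stream rather than an in-memory one.
    """
    start = time.perf_counter()
    remaining = total_bytes
    while remaining > 0:
        data = read_chunk(min(chunk_size, remaining))
        if not data:
            break  # stream exhausted early
        remaining -= len(data)
    return (total_bytes / 1e6) / (time.perf_counter() - start)

# Stand-in stream: 1 MiB of zeros in memory instead of a blob in Azure.
stream = io.BytesIO(b"\x00" * (1 << 20))
rate = measure_mbps(stream.read, 1 << 20)
```

A benchmark block could run this once at flowgraph start and log the rate, mirroring how UHD's benchmark_rate reports its results.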