
geontech / meta-redhawk-sdr


REDHAWK SDR Layer for Yocto/OpenEmbedded-based deployments

Home Page: http://geontech.com/getting-started-with-meta-redhawk-sdr/

License: GNU Lesser General Public License v3.0

Topics: yocto, openembedded-layer, redhawk

meta-redhawk-sdr's Introduction

meta-REDHAWK-SDR

Meta-REDHAWK-SDR is an actively maintained set of Yocto/OpenEmbedded recipes for the REDHAWK SDR framework, its dependencies, the GPP, other example Devices, all shared libraries (softpkg), and all but one C++ Component (DataConverter requires SSE).

NOTE: The most recent version of Yocto tested with this layer is Rocko, 2.4.1.

This repository, along with the base Yocto framework, enables you to build the REDHAWK SDR framework for any hardware platform for which a Board Support Package is available. We at Geon have successfully used this layer on a variety of Zynq targets.

Note: Only the presently tagged version is supported by this repository; however, we can support some earlier versions if required. Please contact us through our website if you need this.

Is my hardware supported?

The Yocto website provides a list of official BSPs, which includes common hardware platforms like the Raspberry Pi, BeagleBoard, BeagleBone, NUC, Intel Atom, etc.

There are plenty of BSPs floating around for other hardware platforms so do some searching before you write your own.

Use the Pyro branch/version as your starting point for searching since it is the most recent version of Yocto that Geon has actively tested this layer against.

Getting Started

Learning By Example

The most straightforward installation is to use Google's repo tool and a manifest. Complete instructions and our manifest can be found here. The instructions amount to a handful of terminal commands that set up a known-good build as a starting point.

Alternatively, you can clone this layer into your own Yocto source tree:

cd <your work directory where the other meta-* layers are located>
git clone git://github.com/geontech/meta-redhawk-sdr.git

Then edit your build/conf/bblayers.conf to include a reference to meta-redhawk-sdr at the end of the list. See our meta-redhawk-sdr/conf/bblayers.conf.sample as an example.
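As a rough sketch (the layer paths below are placeholders for wherever your layers are actually checked out), the end of the list might look like:

    BBLAYERS ?= " \
      /path/to/poky/meta \
      /path/to/poky/meta-poky \
      /path/to/meta-openembedded/meta-oe \
      /path/to/meta-redhawk-sdr \
      "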

Note: The processor architecture is set to armv7l by default. If you need a different processor architecture, set the REDHAWK_PROCESSOR variable in your build/conf/local.conf.
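For instance (the value shown here is purely illustrative), the override in build/conf/local.conf is a single assignment:

    # build/conf/local.conf -- override the default armv7l if your target differs
    REDHAWK_PROCESSOR = "aarch64"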

Finishing the Build

In the contrib/scripts folder is the build-image.sh script, a derivative of a script by Philip Balister (@balister) of Ettus Research, who included it with their meta-sdr layer. The script uses wic to build a single image file that can be copied directly to an SD card (e.g., with dd), producing the appropriate partitions, etc., based on the associated .wks file.

To use it, link the script into your build directory, set it executable, and specify the MACHINE and BUILD_IMAGE environment variables (e.g., qemuarm and redhawk-gpp-image). Running the script then walks through the whole BitBake process for you, automatically.
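A minimal sketch of that workflow, using the example values above (the relative path to the layer may differ in your tree):

    # run from inside your build directory
    ln -s ../meta-redhawk-sdr/contrib/scripts/build-image.sh .
    chmod +x build-image.sh
    MACHINE=qemuarm BUILD_IMAGE=redhawk-gpp-image ./build-image.sh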

SPD Patching

The scripts/spd_utility script serves two purposes. The first is to take a C++ Component, SoftPkg, or Device that has a defined x86_64 implementation and create a new implementation matching cpp-REDHAWK_PROCESSOR; the output directory and related fields are all changed to match so that the output product will overwrite the original implementation back at the Domain controller. The second purpose is to take the output product's SPD and merge it with the Domain's SDRROOT. Together with collecting the ComponentHost package, this lets the Domain distribute your assets in a multi-architecture Domain. See scripts/spd_utility --help for the arguments.

NOTE: If you would like to disable this functionality for your package, include NO_SPD_PATCH = "1" in the recipe.
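For example, in a hypothetical recipe or bbappend for such a package:

    # my-component_1.0.bbappend (recipe name is illustrative)
    # Skip the automatic SPD implementation patching for this package
    NO_SPD_PATCH = "1"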

Additional Resources

Yocto Mega Manual

Bitbake cheatsheet

meta-redhawk-sdr's People

Contributors

ajung123, btgoodwin, patcamwol, rodrigo455


meta-redhawk-sdr's Issues

Error while running build-image.sh

I am running this command in the build directory:
bash ../meta-redhawk-sdr/contrib/scripts/build-image.sh
and I get the following error:
MACHINE: ettus-e3xx-sg1
BUILD_IMAGE: redhawk-usrp-uhd-image
Meta-REDHAWK-SDR Root found: ../meta-redhawk-sdr
STEP 1 - Starting Bit Bake
Loading cache: 100% |###########################################| ETA: 00:00:00
Loaded 2281 entries from dependency cache.
Parsing recipes: 100% |#########################################| Time: 00:00:00
Parsing of 1772 .bb files complete (1771 cached, 1 parsed). 2281 targets, 108 skipped, 0 masked, 0 errors.
WARNING: No bb files matched BBFILE_PATTERN_ettus-e100 '^/home/ekko/worspace/REDHAWK/oe-core/../meta-ettus/e100-bsp/'
NOTE: Resolving any missing task queue dependencies
NOTE: multiple providers are available for runtime u-boot-xlnx-dev (u-boot-xlnx-dev, u-boot-xlnx)
NOTE: consider defining a PREFERRED_PROVIDER entry to match u-boot-xlnx-dev
NOTE: multiple providers are available for jpeg (jpeg, libjpeg-turbo)
NOTE: consider defining a PREFERRED_PROVIDER entry to match jpeg
ERROR: Nothing PROVIDES 'xerces-c-native' (but virtual:native:/home/ekko/worspace/REDHAWK/oe-core/../meta-redhawk-sdr/recipes-deps/xsd/xsd_4.0.0.bb DEPENDS on or otherwise requires it). Close matches:
re2c-native
xrefresh-native
xsd-native
NOTE: Runtime target 'usrp-uhd' is unbuildable, removing...
Missing or unbuildable dependency chain was: ['usrp-uhd', 'bulkiointerfaces', 'redhawk', 'xsd-native', 'xerces-c-native']
ERROR: Required build target 'redhawk-usrp-uhd-image' has no buildable providers.
Missing or unbuildable dependency chain was: ['redhawk-usrp-uhd-image', 'usrp-uhd', 'bulkiointerfaces', 'redhawk', 'xsd-native', 'xerces-c-native']

I am new to Yocto; how do I fix this .bb file? Thank you.

Adding meta-redhawk-sdr to a Yocto build

Hi,

I have followed the steps for the development toolkit from "BSP-Yocto-TISDK-AM57xx-PD18.1.0 Quickstart - PHYTEC Wiki ---> https://wiki.phytec.com/productinfo/phycore-am57x/bsp-yocto-tisdk-am57xx/bsp-yocto-tisdk-am57xx-pd18-2-0-quickstart/bsp-yocto-tisdk-am57xx-pd18-1-0-quickstart" and the Yocto build is working well.

My target is to add the new meta layer "meta-redhawk-sdr" to the existing meta layers.
So, I cloned the redhawk package from the Geontech GitHub and tried to build.

I added the new meta layer in 'build/conf/bblayers.conf':
:${BSPDIR}/sources/meta-redhawk-sdr \

The build completed, but I didn't find any processes included from the new package. As per the new package (attachment - README.md), processes for the "Domain and Device" manager tasks should be seen in the process list.

But I didn't find those.

Can anyone help me with this: how do I bring up these processes in the build so that they start automatically when the board is powered on?

The log from after the board powers on is attached.
Attachments.zip

Regards,
Nayana

Running GPP in an embedded target

While running a GPP device, I get the output:

shell-init: error retrieving current directory: getcwd: cannot access parent directories: Success
ps: invalid option -- 'o'
BusyBox v1.23.1 (2017-05-19 16:57:08 UTC) multi-call binary.

Usage: ps

It seems that ps from BusyBox doesn't have options like -e or -o.
I don't know if it's an issue worth solving with a patch. I was able to run a SigGen component using GPP-arm.

scripts/spd_utility patch_sdrroot() not saving updated spd.xml file

@btgoodwin : This code path in spd_utility patch_sdrroot (dunfell-next branch):

if not t_code_ref:

does not export the updated file. So when I run the install_assets script, if the component already exists in $SDRROOT, the cross-compiled component binary is installed, but the spd.xml file does not show the additional cross-compiled implementation.

Is this a bug? i.e., Doesn't the domain manager need to know about the cross-compiled binary via an implementation entry in the spd.xml file?

ERROR : nodeBooter - Unable to launch DeviceManager softpkg: No matching implementation found.

Hi,

I was trying to include the meta-redhawk-sdr layer (branch: krogoth) in a Yocto build, and the build was successful.

After loading the image onto the hardware, I found the OmniNames and OmniEvents tasks running in the background after manually running the script "node-DevMgr-<MACHINE_NAME>-ALL".

But I am unable to launch the Device/Domain manager with the daemon "nodeBooter -d/D <*.dcd.xml>". I observed the error below:
ERROR : nodeBooter - Unable to launch DeviceManager softpkg: No matching implementation found.

Can anyone suggest whether any packages need to be installed/included during the build?

Regards,
Nayana

Patch all bare 'python' references to python2

Throughout the core framework, 'python' is used, and this is creating issues now that Python 2 is deprecated and Yocto has pushed its support into a separate feature layer. As such, the -native execution of some of these scripts causes issues, as you would expect, as would execution on the target platform if the 'python' link were not pointed at python2. The scope of this issue is to patch those references where found.
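As a rough illustration only (the script name below is hypothetical; the actual fix will be carried as patches where each reference is found), the kind of change involved looks like this in a recipe:

    do_install_append() {
        # Point a bare 'python' shebang at python2 explicitly;
        # 'some-framework-script' is a hypothetical file name
        sed -i 's|^#!.*python$|#!/usr/bin/env python2|' ${D}${bindir}/some-framework-script
    }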

How to communicate between two devices which have REDHAWK SDR images loaded on them

Hi,

I have loaded Yocto images containing meta-redhawk-sdr on two Phytec boards. Now I want to know how to communicate between them. As I see in the manual from REDHAWK SDR, socket_loopback_demo is based on sinksocket and sourcesocket.

Can anyone explain how that waveform works between two devices for data transmission?
Is it possible to use those components for communication without launching the waveform?
If so, can you guide me on how to do this?

If that waveform is only for data transmission within one device, can you give us a route for transmitting data between two devices?

From another issue comment (#66), it was suggested that we need to build a Bridge component; can you explain what that means and how it will help with data communication?

Regards,
Nayana

Remove dynamic patch functions from do_patch

As a general statement, tasks should be re-runnable in Yocto. Some of our patch tasks patch and then patch over those patches dynamically, which means the original patches no longer apply at all (so the step is not reversible). This causes extra pain when trying to develop patch sets against an asset, because one cannot simply roll forwards and backwards in the patch history (in devshell using quilt, for example); our do_patch[postfuncs] have sullied the environment.

More information: we currently use do_patch[postfuncs] in a few places to first tee up things like BB_REDHAWK_PROCESSOR and then globally clobber it in the source tree with a sed expression. We have to do the same against Makefile.am for over half a dozen different things that are part of the core framework templates of existing assets. These kinds of patches should be moved to places like do_install[prefuncs], do_configure[prefuncs], etc. -- wherever feels most appropriate. As always, these functions should result in a no-operation if they have already applied their changes.
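As a rough sketch of that direction (the placeholder token and file name are taken from the description above; this is not the layer's actual implementation), an idempotent do_configure_prepend could look like:

    do_configure_prepend() {
        # No-op if the placeholder has already been replaced
        if grep -q "BB_REDHAWK_PROCESSOR" ${S}/Makefile.am; then
            sed -i "s/BB_REDHAWK_PROCESSOR/${REDHAWK_PROCESSOR}/g" ${S}/Makefile.am
        fi
    }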

Missing or unbuildable dependency chain was: ['redhawk-gpp-image', 'gpp-node', 'redhawk-native', 'python-threading-native']

Hi,

I have YOCTO build setup as per the link below :
https://wiki.phytec.com/productinfo/phycore-am57x/bsp-yocto-tisdk-am57xx/bsp-yocto-tisdk-am57xx-pd18-2-0-quickstart/bsp-yocto-tisdk-am57xx-pd17-1-1-quickstart

As per the above setup, the build is on morty.

Target:
I want to include "meta-redhawk-sdr" in the sources.

I have followed the steps to add the meta-redhawk-sdr layer from the link below:
https://github.com/Geontech/meta-redhawk-sdr

Branch:
I have tried rocko, thud, and warrior. I have seen the same ERROR in all branches, as shown below.

ERROR Log:
cyient@INBE1E-DWL292AV:/opt/PHYTEC_BSPs/yocto_ti/build$ bitbake redhawk-gpp-image
Loading cache: 100% |########################################################################################################################################################################| ETA: 00:00:00
Loaded 2937 entries from dependency cache.
WARNING: No recipes available for:
/opt/PHYTEC_BSPs/yocto_ti/sources/meta-arago/meta-arago-distro/recipes-security/optee/optee-os_git.bbappend
/opt/PHYTEC_BSPs/yocto_ti/sources/meta-processor-sdk/recipes-core/udev/udev_182.bbappend
/opt/PHYTEC_BSPs/yocto_ti/sources/meta-phytec/meta-phytec-ti/recipes-support/cpuset/cpuset_1.5.6.bbappend
WARNING: No bb files matched BBFILE_PATTERN_phytec '^/opt/PHYTEC_BSPs/yocto_ti/sources/meta-phytec/common/'
NOTE: Resolving any missing task queue dependencies
ERROR: Nothing RPROVIDES 'python-threading-native' (but virtual:native:/opt/PHYTEC_BSPs/yocto_ti/sources/meta-redhawk-sdr/recipes-core/core-framework/redhawk_2.2.3.bb RDEPENDS on or otherwise requires it)
NOTE: Runtime target 'python-threading-native' is unbuildable, removing...
Missing or unbuildable dependency chain was: ['python-threading-native']
NOTE: Runtime target 'gpp-node' is unbuildable, removing...
Missing or unbuildable dependency chain was: ['gpp-node', 'redhawk-native', 'python-threading-native']
ERROR: Required build target 'redhawk-gpp-image' has no buildable providers.
Missing or unbuildable dependency chain was: ['redhawk-gpp-image', 'gpp-node', 'redhawk-native', 'python-threading-native']

Summary: There were 2 WARNING messages shown.
Summary: There were 2 ERROR messages shown, returning a non-zero exit code.

Even when I tried executing "bitbake redhawk-test-image", the ERROR is the same.

For the pyro and krogoth branches, I am facing a library issue, as below.

ERROR LOG:
cyient@INBE1E-DWL292AV:/opt/PHYTEC_BSPs/yocto_ti/build$ bitbake redhawk-gpp-image
Loading cache: 100% |########################################################################################################################################################################| ETA: 00:00:00
Loaded 2936 entries from dependency cache.
WARNING: No recipes available for:
/opt/PHYTEC_BSPs/yocto_ti/sources/meta-arago/meta-arago-distro/recipes-security/optee/optee-os_git.bbappend
/opt/PHYTEC_BSPs/yocto_ti/sources/meta-processor-sdk/recipes-core/udev/udev_182.bbappend
/opt/PHYTEC_BSPs/yocto_ti/sources/meta-phytec/meta-phytec-ti/recipes-support/cpuset/cpuset_1.5.6.bbappend
WARNING: No bb files matched BBFILE_PATTERN_phytec '^/opt/PHYTEC_BSPs/yocto_ti/sources/meta-phytec/common/'
NOTE: Resolving any missing task queue dependencies

Build Configuration:
BB_VERSION = "1.30.0"
BUILD_SYS = "x86_64-linux"
NATIVELSBSTRING = "Ubuntu-14.04"
TARGET_SYS = "arm-linux-gnueabi"
MACHINE = "am57xx-phycore-rdk"
DISTRO = "arago"
DISTRO_VERSION = "2016.10"
TUNE_FEATURES = "arm armv7a vfp thumb neon callconvention-hard"
TARGET_FPU = "hard"
meta-processor-sdk = "morty:3678d672c9c47a646897286281d4260ac4ace960"
meta-arago-distro
meta-arago-extras = "HEAD:76381bbc10b93ffc6c014154814872b170b93796"
meta-qt5 = "HEAD:2b1871f0d139dc3caaa779a32a1931409c245a36"
meta-networking
meta-python
meta-ruby
meta-oe = "morty:851a064b53dca3b14dd33eaaaca9573b1a36bf0e"
meta-ti = "HEAD:d8aa76970a0ba48762c631dfd79dbed49222373b"
meta-linaro-toolchain = "HEAD:9b616ce6d4293387d254f36800389b2910895420"
meta = "HEAD:28da89a20b70f2bf0c85da6e8af5d94a3b7d76c9"
meta-phytec
meta-phytec-ti = "HEAD:086a659eb42b85ca68d21f4bee8eb51cb9ea3019"
meta-redhawk-sdr = "pyro:06ff489191194292aed6386407602049e07ec887"
meta-filesystems = "morty:851a064b53dca3b14dd33eaaaca9573b1a36bf0e"

NOTE: Preparing RunQueue
WARNING: /opt/PHYTEC_BSPs/yocto_ti/sources/meta-phytec/meta-phytec-ti/recipes-kernel/linux/linux-phytec-ti_4.4.bb.do_compile is tainted from a forced run
WARNING: /opt/PHYTEC_BSPs/yocto_ti/sources/meta-phytec/meta-phytec-ti/recipes-bsp/u-boot/u-boot-phytec_2016.05.bb.do_compile is tainted from a forced run
NOTE: Executing SetScene Tasks
NOTE: Executing RunQueue Tasks
ERROR: redhawk-2.0.6-r2 do_compile: oe_runmake failed
ERROR: redhawk-2.0.6-r2 do_compile: Function failed: do_compile (log file is located at /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/redhawk/2.0.6-r2/temp/log.do_compile.11636)
ERROR: Logfile of failure stored in: /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/redhawk/2.0.6-r2/temp/log.do_compile.11636
Log data follows:
| DEBUG: SITE files ['endian-little', 'bit-32', 'arm-common', 'arm-32', 'common-linux', 'common-glibc', 'arm-linux', 'arm-linux-gnueabi', 'common']
| DEBUG: Executing shell function do_compile
| NOTE: make -j 4
| Making all in acinclude
| make[1]: Entering directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/redhawk/2.0.6-r2/git/redhawk-core-framework/redhawk/src/acinclude' | make[1]: Nothing to be done for all'.
| make[1]: Leaving directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/redhawk/2.0.6-r2/git/redhawk-core-framework/redhawk/src/acinclude' | Making all in etc | make[1]: Entering directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/redhawk/2.0.6-r2/git/redhawk-core-framework/redhawk/src/etc'
| make[1]: Nothing to be done for all'. | make[1]: Leaving directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/redhawk/2.0.6-r2/git/redhawk-core-framework/redhawk/src/etc'
| Making all in base
| make[1]: Entering directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/redhawk/2.0.6-r2/git/redhawk-core-framework/redhawk/src/base' | Making all in framework/idl | make[2]: Entering directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/redhawk/2.0.6-r2/git/redhawk-core-framework/redhawk/src/base/framework/idl'
| make all-am
| make[3]: Entering directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/redhawk/2.0.6-r2/git/redhawk-core-framework/redhawk/src/base/framework/idl' | ../../../arm-linux-gnueabi-libtool --tag=CXX --mode=link arm-linux-gnueabihf-g++ -march=armv7-a -marm -mfpu=neon -mfloat-abi=hard --sysroot=/opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/sysroots/am57xx-phycore-rdk -Wall -Wno-switch -D__arm__ -D__linux__ -D__OSVERSION__=2 -D__arm__ -D__linux__ -D__OSVERSION__=2 -D__arm__ -D__linux__ -D__OSVERSION__=2 -D__arm__ -D__linux__ -D__OSVERSION__=2 -I../../include -isystem/opt/PHYTEC_BSPs/gcc-linaro-5.3-2016.02-x86_64_arm-linux-gnueabihf/arm-linux-gnueabihf/include -O2 -pipe -g -feliminate-unused-debug-types -fdebug-prefix-map=/opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/redhawk/2.0.6-r2=/usr/src/debug/redhawk/2.0.6-r2 -fdebug-prefix-map=/opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/sysroots/x86_64-linux= -fdebug-prefix-map=/opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/sysroots/am57xx-phycore-rdk= -fpermissive -std=gnu++98 -Wall -version-info 4:0:0 -L/opt/PHYTEC_BSPs/gcc-linaro-5.3-2016.02-x86_64_arm-linux-gnueabihf/arm-linux-gnueabihf/lib -Wl,-rpath-link,/opt/PHYTEC_BSPs/gcc-linaro-5.3-2016.02-x86_64_arm-linux-gnueabihf/arm-linux-gnueabihf/lib -Wl,-O1 -Wl,--hash-style=gnu -o libossieidl.la -rpath /usr/redhawk-sdr/core/lib libossieidl_la-cfSK.lo libossieidl_la-cfDynSK.lo libossieidl_la-DataTypeSK.lo libossieidl_la-DataTypeDynSK.lo libossieidl_la-PortSK.lo libossieidl_la-PortDynSK.lo libossieidl_la-PortTypesSK.lo libossieidl_la-PortTypesDynSK.lo libossieidl_la-StandardEventSK.lo libossieidl_la-StandardEventDynSK.lo libossieidl_la-AggregateDevicesSK.lo libossieidl_la-AggregateDevicesDynSK.lo libossieidl_la-ExtendedEventSK.lo libossieidl_la-ExtendedEventDynSK.lo libossieidl_la-QueryablePortSK.lo libossieidl_la-QueryablePortDynSK.lo libossieidl_la-WellKnownPropertiesSK.lo libossieidl_la-WellKnownPropertiesDynSK.lo libossieidl_la-sandboxSK.lo libossieidl_la-sandboxDynSK.lo libossieidl_la-LogInterfacesSK.lo libossieidl_la-LogInterfacesDynSK.lo libossieidl_la-EventChannelManagerSK.lo libossieidl_la-EventChannelManagerDynSK.lo -lomniORB4 -lomnithread -lomnithread -lomniDynamic4 -lomniORB4 -lomnithread -luuid -llog4cxx | /bin/grep: /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/sysroots/am57xx-phycore-rdk/home/tcwg-buildslave/workspace/tcwg-make-release/label/tcwg-x86_64-ex40/target/arm-linux-gnueabihf/_build/builds/destdir/x86_64-unknown-linux-gnu/arm-linux-gnueabihf/lib/libstdc++.la: No such file or directory | sed: can't read /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/sysroots/am57xx-phycore-rdk/home/tcwg-buildslave/workspace/tcwg-make-release/label/tcwg-x86_64-ex40/target/arm-linux-gnueabihf/_build/builds/destdir/x86_64-unknown-linux-gnu/arm-linux-gnueabihf/lib/libstdc++.la: No such file or directory | arm-linux-gnueabi-libtool: error: '/opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/sysroots/am57xx-phycore-rdk/home/tcwg-buildslave/workspace/tcwg-make-release/label/tcwg-x86_64-ex40/target/arm-linux-gnueabihf/_build/builds/destdir/x86_64-unknown-linux-gnu/arm-linux-gnueabihf/lib/libstdc++.la' is not a valid libtool archive | make[3]: *** [libossieidl.la] Error 1 | make[3]: Leaving directory 
/opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/redhawk/2.0.6-r2/git/redhawk-core-framework/redhawk/src/base/framework/idl'
| make[2]: *** [all] Error 2
| make[2]: Leaving directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/redhawk/2.0.6-r2/git/redhawk-core-framework/redhawk/src/base/framework/idl' | make[1]: *** [all-recursive] Error 1 | make[1]: Leaving directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/redhawk/2.0.6-r2/git/redhawk-core-framework/redhawk/src/base'
| make: *** [all-recursive] Error 1
| ERROR: oe_runmake failed
| ERROR: Function failed: do_compile (log file is located at /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/armv7ahf-neon-linux-gnueabi/redhawk/2.0.6-r2/temp/log.do_compile.11636)
ERROR: Task 2187 (/opt/PHYTEC_BSPs/yocto_ti/sources/meta-redhawk-sdr/recipes-core-framework/redhawk/redhawk_2.0.6.bb, do_compile) failed with exit code '1'
NOTE: Tasks Summary: Attempted 2208 tasks of which 2204 didn't need to be rerun and 1 failed.
Waiting for 0 running tasks to finish:

Summary: 1 task failed:
/opt/PHYTEC_BSPs/yocto_ti/sources/meta-redhawk-sdr/recipes-core-framework/redhawk/redhawk_2.0.6.bb, do_compile
Summary: There were 4 WARNING messages shown.
Summary: There were 2 ERROR messages shown, returning a non-zero exit code.

Can anyone help me: which branch of REDHAWK matches the yocto-morty branch, and what is the solution for the above issues?

Purpose:
I want to bring up the Device and Domain managers of REDHAWK when we flash the images generated by BitBake.

Regards,
Nayana

redhawk-waveform class and EXPORT_FUNCTIONS is illegal

Since the class name has a hyphen, we cannot export the do_install task for users to override. This applies to any release.

Fixes include:

  1. Eliminate the EXPORT_FUNCTIONS from redhawk-waveform.bbclass, since really there's only one way to install a waveform.
  2. Rename redhawk-waveform.bbclass to remove the hyphen (e.g., redhawk_waveform.bbclass), which makes it "odd" compared to the other class names. So then, should we rename them all? (A sketch of this option follows below.)
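A minimal sketch of option 2 (the install paths here are assumptions, not the layer's actual do_install):

    # redhawk_waveform.bbclass -- renamed so the exported function name is a
    # legal shell identifier
    EXPORT_FUNCTIONS do_install

    redhawk_waveform_do_install() {
        # Assumed waveform install location; adjust to the layer's real SDRROOT
        install -d ${D}/usr/redhawk-sdr/dom/waveforms/${PN}
        install -m 0644 ${S}/*.sad.xml ${D}/usr/redhawk-sdr/dom/waveforms/${PN}/
    }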

qpycore_chimera.cpp: has no member named 'qt4_flags'

When I run the following:
bitbake python-pyqt
it throws the following:

qpycore_chimera.cpp: In member function 'void Chimera::set_flag()':
qpycore_chimera.cpp:552:50: error: 'pyqt4ClassTypeDef {aka struct _pyqt4ClassTypeDef}' has no member named 'qt4_flags'
_is_flag = ((pyqt4ClassTypeDef _)type)->qt4_flags & 0x01;
^
make[2]: *** [qpycore_chimera.o] Error 1
make[2]: Leaving directory /buildarea/raid0/ddu/poky-krogoth-15.0.0/build-beaglebone2/tmp/work/cortexa8hf-neon-poky-linux-gnueabi/python-pyqt/4.9.6-r1/PyQt-x11-gpl-4.9.6/qpy/QtCore' make[1]: *** [sub-QtCore-make_default] Error 2 make[1]: Leaving directory/buildarea/raid0/ddu/poky-krogoth-15.0.0/build-beaglebone2/tmp/work/cortexa8hf-neon-poky-linux-gnueabi/python-pyqt/4.9.6-r1/PyQt-x11-gpl-4.9.6/qpy'
make: *** [sub-qpy-make_default] Error 2

ERROR: (virtual:native:/opt/PHYTEC_BSPs/yocto_ti/sources/meta-redhawk-sdr/recipes-deps/omniorb/omniorb_4.2.3.bb, do_compile) failed with exit code '1'

Hi,
I am trying to build arago-core-tisdk-image.
The document I followed to build is: https://wiki.phytec.com/display/PRODUCTINFO/BSP-Yocto-TISDK-AM57xx-PD17.1.0+Quickstart
But while running the 'bitbake arago-core-tisdk-image' command, I am getting the error below.

Additional info:
python version : 2.7

Error:

MACHINE=am57xx-phycore-rdk bitbake arago-core-tisdk-image
Loading cache: 100% |###################################################################################################################################################| ETA: 00:00:00
Loaded 2938 entries from dependency cache.
WARNING: No recipes available for:
/opt/PHYTEC_BSPs/yocto_ti/sources/meta-arago/meta-arago-distro/recipes-security/optee/optee-os_git.bbappend
/opt/PHYTEC_BSPs/yocto_ti/sources/meta-processor-sdk/recipes-core/udev/udev_182.bbappend
/opt/PHYTEC_BSPs/yocto_ti/sources/meta-phytec/meta-phytec-ti/recipes-support/cpuset/cpuset_1.5.6.bbappend
WARNING: No bb files matched BBFILE_PATTERN_phytec '^/opt/PHYTEC_BSPs/yocto_ti/sources/meta-phytec/common/'
NOTE: Resolving any missing task queue dependencies
Build Configuration:
BB_VERSION = "1.30.0"
BUILD_SYS = "x86_64-linux"
NATIVELSBSTRING = "Ubuntu-14.04"
TARGET_SYS = "arm-linux-gnueabi"
MACHINE = "am57xx-phycore-rdk"
DISTRO = "arago"
DISTRO_VERSION = "2016.10"
TUNE_FEATURES = "arm armv7a vfp thumb neon callconvention-hard"
TARGET_FPU = "hard"
meta-processor-sdk = "HEAD:3678d672c9c47a646897286281d4260ac4ace960"
meta-arago-distro
meta-arago-extras = "HEAD:76381bbc10b93ffc6c014154814872b170b93796"
meta-qt5 = "HEAD:2b1871f0d139dc3caaa779a32a1931409c245a36"
meta-networking
meta-python
meta-ruby
meta-oe
meta-filesystems = "HEAD:851a064b53dca3b14dd33eaaaca9573b1a36bf0e"
meta-ti = "HEAD:d8aa76970a0ba48762c631dfd79dbed49222373b"
meta-linaro-toolchain = "HEAD:9b616ce6d4293387d254f36800389b2910895420"
meta = "HEAD:28da89a20b70f2bf0c85da6e8af5d94a3b7d76c9"
meta-phytec
meta-phytec-ti = "HEAD:086a659eb42b85ca68d21f4bee8eb51cb9ea3019"
meta-redhawk-sdr = "thud:2e3f30e1af435b995ed47a71837b79ef2fe7144c"
NOTE: Preparing RunQueue
NOTE: Executing SetScene Tasks
NOTE: Executing RunQueue Tasks
ERROR: omniorb-native-4.2.3-r0 do_compile: oe_runmake failed
ERROR: omniorb-native-4.2.3-r0 do_compile: Function failed: do_compile (log file is located at /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/x86_64-linux/omniorb-native/4.2.3-r0/temp/log.do_compile.28416)
ERROR: Logfile of failure stored in: /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/x86_64-linux/omniorb-native/4.2.3-r0/temp/log.do_compile.28416
Log data follows:
| DEBUG: Executing shell function do_compile
| NOTE: make -j 8
| making export in ./src...
| make[1]: Entering directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/x86_64-linux/omniorb-native/4.2.3-r0/build/src' | making export in src/tool... | make[2]: Entering directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/x86_64-linux/omniorb-native/4.2.3-r0/build/src/tool'
| making export in src/tool/omkdepend...
| make[3]: Entering directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/x86_64-linux/omniorb-native/4.2.3-r0/build/src/tool/omkdepend' | File omkdepend hasn't changed. | make[3]: Leaving directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/x86_64-linux/omniorb-native/4.2.3-r0/build/src/tool/omkdepend'
| making export in src/tool/omniidl...
| make[3]: Entering directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/x86_64-linux/omniorb-native/4.2.3-r0/build/src/tool/omniidl' | making export in src/tool/omniidl/cxx... | Traceback (most recent call last): | File "<string>", line 1, in <module> | File "/opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/sysroots/x86_64-linux/usr/lib/python2.7/distutils/sysconfig.py", line 22, in <module> | PREFIX = os.path.normpath(sys.prefix).replace( os.getenv("BUILD_SYS"), os.getenv("HOST_SYS") ) | TypeError: expected a string or other character buffer object | Traceback (most recent call last): | File "<string>", line 1, in <module> | File "/opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/sysroots/x86_64-linux/usr/lib/python2.7/distutils/sysconfig.py", line 22, in <module> | PREFIX = os.path.normpath(sys.prefix).replace( os.getenv("BUILD_SYS"), os.getenv("HOST_SYS") ) | TypeError: expected a string or other character buffer object | make[4]: Entering directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/x86_64-linux/omniorb-native/4.2.3-r0/build/src/tool/omniidl/cxx'
| making export in src/tool/omniidl/cxx/cccp...
| make[5]: Entering directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/x86_64-linux/omniorb-native/4.2.3-r0/build/src/tool/omniidl/cxx/cccp' | File omnicpp hasn't changed. | make[5]: Leaving directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/x86_64-linux/omniorb-native/4.2.3-r0/build/src/tool/omniidl/cxx/cccp'
| g++ -c -isystem/opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/sysroots/x86_64-linux/usr/include -O2 -pipe -Wall -Wno-unused -fexceptions -isystem/opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/sysroots/x86_64-linux/usr/include -I -DPYTHON_INCLUDE="<Python.h>" -DPYTHON_THREAD_INC="<pythread.h>" -DIDLMODULE_VERSION=""0x2630"" -fPIC -I. -I../../../../../omniORB-4.2.3/src/tool/omniidl/cxx -I../../../../include -I../../../../../omniORB-4.2.3/include -D__OSVERSION__=2 -D__linux__ -D__x86_64__ -o idlpython.o ../../../../../omniORB-4.2.3/src/tool/omniidl/cxx/idlpython.cc
| ../../../../../omniORB-4.2.3/src/tool/omniidl/cxx/idlpython.cc:31:12: error: #include expects "FILENAME" or
| # include PYTHON_INCLUDE
| ^
| ../../../../../omniORB-4.2.3/src/tool/omniidl/cxx/idlpython.cc:47:6: error: #error "omniidl requires Python 1.5.2 or higher"
| # error "omniidl requires Python 1.5.2 or higher"
| ^
| ../../../../../omniORB-4.2.3/src/tool/omniidl/cxx/idlpython.cc:99:15: error: 'PyObject' does not name a type
.....
.....
....
| ../../../../../omniORB-4.2.3/src/tool/omniidl/cxx/idlpython.cc: In function 'void init_omniidl()':
| ../../../../../omniORB-4.2.3/src/tool/omniidl/cxx/idlpython.cc:1532:5: error: 'PyObject' was not declared in this scope
| PyObject* m = Py_InitModule((char*)"_omniidl", omniidl_methods);
| ^
| ../../../../../omniORB-4.2.3/src/tool/omniidl/cxx/idlpython.cc:1532:15: error: 'm' was not declared in this scope
| PyObject* m = Py_InitModule((char*)"_omniidl", omniidl_methods);
| ^
| ../../../../../omniORB-4.2.3/src/tool/omniidl/cxx/idlpython.cc:1532:52: error: 'omniidl_methods' was not declared in this scope
| PyObject* m = Py_InitModule((char*)"_omniidl", omniidl_methods);
| ^
| ../../../../../omniORB-4.2.3/src/tool/omniidl/cxx/idlpython.cc:1532:67: error: 'Py_InitModule' was not declared in this scope
| PyObject* m = Py_InitModule((char*)"_omniidl", omniidl_methods);
| ^
| ../../../../../omniORB-4.2.3/src/tool/omniidl/cxx/idlpython.cc:1534:44: error: 'PyString_FromString' was not declared in this scope
| PyString_FromString(IDLMODULE_VERSION));
| ^
| ../../../../../omniORB-4.2.3/src/tool/omniidl/cxx/idlpython.cc:1534:45: error: 'PyObject_SetAttrString' was not declared in this scope
| PyString_FromString(IDLMODULE_VERSION));
| ^
| make[4]: *** [idlpython.o] Error 1
| make[4]: Leaving directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/x86_64-linux/omniorb-native/4.2.3-r0/build/src/tool/omniidl/cxx' | make[3]: *** [export] Error 1 | make[3]: Leaving directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/x86_64-linux/omniorb-native/4.2.3-r0/build/src/tool/omniidl'
| make[2]: *** [export] Error 1
| make[2]: Leaving directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/x86_64-linux/omniorb-native/4.2.3-r0/build/src/tool' | make[1]: *** [export] Error 1 | make[1]: Leaving directory /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/x86_64-linux/omniorb-native/4.2.3-r0/build/src'
| make: *** [all] Error 1
| ERROR: oe_runmake failed
| WARNING: /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/x86_64-linux/omniorb-native/4.2.3-r0/temp/run.do_compile.7946:1 exit 1 from 'exit 1'
| ERROR: Function failed: do_compile (log file is located at /opt/PHYTEC_BSPs/yocto_ti/build/arago-tmp-external-linaro-toolchain/work/x86_64-linux/omniorb-native/4.2.3-r0/temp/log.do_compile.7946)
ERROR: Task 7984 (virtual:native:/opt/PHYTEC_BSPs/yocto_ti/sources/meta-redhawk-sdr/recipes-deps/omniorb/omniorb_4.2.3.bb, do_compile) failed with exit code '1'
NOTE: Tasks Summary: Attempted 2090 tasks of which 2064 didn't need to be rerun and 1 failed.
NOTE: Tasks Summary: Attempted 2090 tasks of which 2064 didn't need to be rerun and 1 failed.
Waiting for 0 running tasks to finish:
Summary: 1 task failed:
virtual:native:/opt/PHYTEC_BSPs/yocto_ti/sources/meta-redhawk-sdr/recipes-deps/omniorb/omniorb_4.2.3.bb, do_compile

What mistake have I made on my side?
Please help me resolve the above issue.

Thanks
shravan

spd_utility skips updating "other" dependency types

It looks like upstream REDHAWK uses other instead of runtime_requirements (at least that's what the IDE is enforcing). If dependencies aren't patched to also identify the cpp-PACKAGE_ARCH location, the autoconf macros will not be able to crawl from one SPD to the next to find the package config files, resulting in a package config error.

This is a few-character fix on line 114 of spd_utility:

        if dep.get_type() in ['other', 'runtime_requirements']:

Interacting between Components running on two different hardware platforms

Hi all,

I am trying to interact between two pieces of hardware which are loaded with a Yocto image (built with the meta-redhawk-sdr layer).
OmniNames, OmniEvents, and the Domain and Device managers are able to run on the hardware.

Requirement:
On each piece of hardware I launched REDHAWK components using the sandbox; are there any ways to interact between those components running on different hardware?

Regards,
Nayana

export PYTHONPATH in do_configure_prepend

I was working with a custom REDHAWK Component with a redhawk softpkg dependency.
While bitbaking the component, the do_configure step encounters this configure.ac line:

    RH_SOFTPKG_CXX([/deps/mylibrary/mylibrary.spd.xml])

which checks for SDRROOT and runs the Python script 'redhawk-softpkg'.

In order to get through this step I had to add the following to my recipe:

    CACHED_CONFIGUREVARS += "ossie_cv_sdr_root=${SDRROOT_STAGED}"

(extending CACHED_CONFIGUREVARS of 'redhawk-oeconf.bbclass' so the spd.xml file can be found)

    do_configure_prepend() {
        export BUILD_SYS=${BUILD_SYS}
        export HOST_SYS=${HOST_SYS}
        export STAGING_INCDIR=${STAGING_INCDIR}
        export STAGING_LIBDIR=${STAGING_LIBDIR}
        export PKG_CONFIG_PATH="${OSSIEHOME_STAGED}/lib/pkgconfig:${PKG_CONFIG_PATH}"
        export PYTHONPATH=${OSSIEHOME_STAGED}/lib/python:${PYTHONPATH}
    }

(extending the PYTHONPATH in do_configure_prepend so 'redhawk-softpkg' can import ossie.parsers)

Error while building redhawk-bulkio_2.0.4

log.do_compile.txt
Hello,

I'm trying to bitbake for the USRP E310... I'm getting this bad-instruction error while building redhawk-bulkio:

...
{standard input}:365797: Error: bad instruction incl [r3,#4]' | {standard input}:365893: Error: bad instruction lock'
| {standard input}:365894: Error: bad instruction incl [r2,#4]' | make[1]: *** [cpp/libbulkio_2.0_la-bulkio_in_port.lo] Error 1 | make[1]: Leaving directory /home/fmce794/e300-oe-build/build/tmp-glibc/work/armv7ahf-vfp-neon-oe-linux-gnueabi/redhawk-bulkio/2.0.4-r0/git/bulkioInterfaces/libsrc'
| make: *** [all-recursive] Error 1
| ERROR: oe_runmake failed
| WARNING: /home/fmce794/e300-oe-build/build/tmp-glibc/work/armv7ahf-vfp-neon-oe-linux-gnueabi/redhawk-bulkio/2.0.4-r0/temp/run.do_compile.8753:1 exit 1 from
| exit 1
| ERROR: Function failed: do_compile (log file is located at /home/fmce794/e300-oe-build/build/tmp-glibc/work/armv7ahf-vfp-neon-oe-linux-gnueabi/redhawk-bulkio/2.0.4-r0/temp/log.do_compile.8753)
ERROR: Task 1521 (/home/fmce794/e300-oe-build/oe-core/../meta-redhawk-sdr/recipes-redhawk/redhawk-bulkio/redhawk-bulkio_2.0.4.bb, do_compile) failed with exit code '1'

Any ideas why this is happening? A do_compile log is attached. Thanks!

Failed to satisfy device dependencies for component - Waveform loading on Target board

Hi,

I have the meta-redhawk-sdr layer (branch: pyro) with the Yocto BSP for am57xx, and the image is loaded on the target board.

The loaded image has all the REDHAWK components and a basic waveform.
Now I am trying to launch one of the available basic waveforms from the image loaded on the target board (Phytec).

Procedure / followed steps:

The target board is connected to a desktop server via serial cable, and a VLAN is also available.
I started the Domain and Device managers with naming services on the target board.
I connected to the sandbox from a Python script as shown below:

[redhawk@2a23bdd8bb78 ~]$ python
Python 2.6.6 (r266:84292, Aug 18 2016, 15:13:37)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-17)] on linux2
Type "help", "copyright", "credits" or "license" for more information.

        from ossie.utils import redhawk,sb

        import time

        dom = redhawk.attach("REDHAWK_DEV")

        app = createApplication("/waveforms/rh/basic_components_demo/basic_components_demo.sad.xml")
        Traceback (most recent call last):

File "", line 1, in
NameError: name 'createApplication' is not defined

        app = dom.createApplication("/waveforms/rh/basic_components_demo/basic_components_demo.sad.xml")
        Traceback (most recent call last):

File "", line 1, in
File "/usr/local/redhawk/core/lib/python/ossie/utils/redhawk/core.py", line 1794, in createApplication
app = app_factory.create(name, initConfiguration, [])
File "/usr/local/redhawk/core/lib/python/ossie/cf/cf_idl.py", line 2026, in create
return _omnipy.invoke(self, "create", _0_CF.ApplicationFactory._d_create, args)

ossie.cf.CF.CreateApplicationError: CF.ApplicationFactory.CreateApplicationError(errorNumber=CF_ENOSPC, msg="Failed to satisfy device dependencies for component: 'rh.SigGen' with component id: 'SigGen_sine:rh.basic_components_demo_339_120418450_1'")

Understanding from the source:

As seen in the source and the *.spd.xml files, the PROCESSOR name plays a major role with respect to target boards (cross-compiled OS).

My requirement for the target board is MACHINE_NAME=am57xx-phycore-rdk and PROCESSOR=armv7l.

I have set both variables in conf/local.conf.

But only a few of the components show an armv7l implementation, such as the dsp, rbdsdecoder, Device manager, and Domain manager spd.xml files.

All other components have an armv7ahf-neon processor implementation.

Can anyone suggest how to change the processor name from armv7ahf-neon to armv7l while building the image?

If my observations about the above error are wrong, can you suggest a solution for "Failed to satisfy device dependencies for component"?

Regards,
Nayana

sse2neon - better machine support

The COMPATIBLE_MACHINE = "(arm|aarch64)" usage in this recipe will not work for devices like the Ettus E310 since those values aren't in the MACHINEOVERRIDES. I'm sure this extends to other machines as well. I'm open to other suggestions, but this would be more flexible since in those cases, OVERRIDES will almost certainly have arm or aarch64 in the list (otherwise the dependency chain for rh.DataConverter would not have reached over to those patches and dependencies lists):

COMPATIBLE_MACHINE_arm = "${MACHINE}"
COMPATIBLE_MACHINE_aarch64 = "${MACHINE}"

We would need to apply this as far back as rocko.

Required processor_name value in GPP.prf.xml file

I ran into this while trying to launch a waveform for distributed systems targeting ARM and x86.
GPP.prf.xml currently shows:

  <simple id="DCE:fefb9c66-d14a-438d-ad59-2cfd1adb272b" mode="readonly" name="processor_name" type="string">
    <description>SCA required property describing the CPU type</description>
    <kind kindtype="property"/>
    <kind kindtype="configure"/>
    <kind kindtype="allocation"/>
    <action type="eq"/>
  </simple>

There should be a <value>armv7l</value> tag in there. I don't know whether the gpp_setup script should deal with this. Maybe the other properties need the same attention.

How to interact with the REDHAWK Device Manager on an AM57xx EVM Phytec board

My requirement:
Pinging the hardware (AM57xx EVM board) through the Device manager running on the hardware, from any desktop server.

Test setup:
The Yocto BSP image build completed with the meta-redhawk-sdr layer (branch: pyro) and the image is loaded on the hardware. On the hardware I can see the Device and Domain managers running as background tasks. VLAN cables connect the hardware and the local desktop server within the same network.

I am trying to interact with the REDHAWK Device manager running on the AM57xx EVM board from one of my desktops, using the ping utility.

But on the hardware I didn't observe any notification showing that the hardware is responding to the ping request.

Is there any way to find out whether our ping request is reaching the hardware via the REDHAWK Device manager?

If yes, can you guide me to where I can find the interaction interface between the hardware (Device manager) and the local server?

So far, it does not print anything via the REDHAWK Device manager on the board.

REDHAWK branch: pyro. Build: Yocto.

Can anyone let me know how to interact with the REDHAWK Device manager from any other desktop/laptop?

Thanks,
Nayana

GPP gpp_setup patch in 2.2.3 is unnecessary

The PR against the core-framework has been incorporated in a way that lets you pass --processorname with whatever value you want (along with a whole host of other changes... it looks like a rewrite). For rocko and thud 2.2.3, the trust-uname-m.patch will fail.

Ettus e310 sg3 firmware

In this article there's a link to a firmware image for the E310. I've been able to flash an E310 and have it connect to the REDHAWK 2.2.5 domain manager. However, I have no tuners listed under the running E310.

I'm beginning to wonder: if the firmware was built for the SG1 model, would that be an issue when trying to use an SG3 model?

zeus-next GPP crashes on startup, ARM 32bit

Layers:

zeus      poky/meta
          poky/meta-poky
          poky/meta-yocto-bsp
zeus      meta-openembedded/meta-oe
          meta-openembedded/meta-python
          meta-openembedded/meta-filesystems
          meta-openembedded/meta-networking
zeus      meta-python2
zeus-next meta-redhawk-sdr

Image: redhawk-test-image

Error output:

2020-07-07 18:03:34 INFO  DeviceManagerLoader:457 - Starting Device Manager with /nodes/DevMgr-gpp/DeviceManager.dcd.xml
2020-07-07 18:03:34 INFO  DeviceManager:493 - adding in property for :DCE:fefb9c66-d14a-438d-ad59-2cfd1adb272b value : armv7vet2hf-neon
2020-07-07 18:03:34 INFO  DeviceManager:497 - adding in property for :DCE:4a23ad60-0b25-4121-a630-68803a498f75 value : Linux
2020-07-07 18:03:34 INFO  DeviceManager:664 - Connecting to Domain Manager REDHAWK_DEV/REDHAWK_DEV
2020-07-07 18:03:34 INFO  DeviceManager:1044 - Placing Component CompId: DevMgr-gpp:GPP ProfileName : GPP
2020-07-07 18:03:34 INFO  DeviceManagerLoader:511 - Starting ORB!
2020-07-07 18:03:35 INFO  GPP.system.Device:1188 - DEV-ID:DevMgr-gpp:GPP Requesting IDM CHANNEL IDM_Channel
2020-07-07 18:03:35 INFO  GPP.system.EventManager:606 - PUBLISHER - Channel:IDM_Channel Reg-Idf881e3c7-ebce-4d83-9717-bce322eeb312 RESOURCE:DevMgr-gpp:GPP
2020-07-07 18:03:35 INFO  DeviceManager:1596 - Registering device GPP device id DevMgr-gpp:GPP on Device Manager DevMgr-gpp
2020-07-07 18:03:35 INFO  DeviceManager:1636 - Device LABEL: GPP  SPD loaded: GPP' - 'DCE:4e20362c-4442-4656-af6d-aedaaf13b275
2020-07-07 18:03:35 INFO  GPP:1073 - initialize()
2020-07-07 18:03:35 INFO  GPP.system.EventManager:664 - SUBSCRIBER - Channel:ODM_Channel Reg-Id002ba541-1c97-4093-b8ec-6b72a35b3a35 resource:DevMgr-gpp:GPP
2020-07-07 18:03:35 INFO  GPP:1094 - Component Output Redirection is DISABLED.
2020-07-07 18:03:35 INFO  GPP:2241 - Affinity Disable State,  disabled=1
2020-07-07 18:03:35 INFO  GPP:2243 - Disabling affinity processing requests.
2020-07-07 18:03:35 INFO  GPP:858 -  initialize CPU Monitor --- wl size 0
terminate called without an active exception
2020-07-07 18:03:38 ERROR DeviceManager:1700 - The following CORBA exception occurred: COMM_FAILURE while attempting to initialize Device GPP. Device registration with Device Manager failed
2020-07-07 18:03:38 WARN  DeviceManager:2913 - Child process GPP (pid 821) has terminated with signal 6

My guess is that somewhere along the line, the mechanisms under the hood for probing the underlying Linux system (coreutils, etc.) have been updated to the point that there is breakage with the typical BusyBox counterparts that get installed. This particular error doesn't happen on qemuarm64, however, so it's unclear whether this is a BusyBox problem or whether it's resolvable by replacing a tool (for example, eliminating the ps patch by setting an RDEPENDS against procps).
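A minimal sketch of that last idea (the GPP recipe name and version pattern are assumed), as a bbappend in one's own layer:

    # gpp_%.bbappend (recipe name assumed)
    # Pull in the full procps 'ps' rather than relying on the BusyBox applet
    RDEPENDS_${PN} += "procps"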

omniorb switch source from http to https

I tried to bake the recipe for omniorb_4.2.0.bb.
It wasn't possible to fetch the source because the request was forbidden.
In a bbappend file I switched the sources to https and now it works:

    SRC_URI = "https://downloads.sourceforge.net/omniorb/omniORB-4.2.0.tar.bz2;name=omniORB420tarbz2"
    SRC_URI_virtclass-native = "https://downloads.sourceforge.net/omniorb/omniORB-4.2.0.tar.bz2;name=omniORB420tarbz2"
