Comments (17)
@wajda Do you think this issue of RDD transformations not having metadata could be potentially solvable through Spline eventually?
More specifically: I see that the entry point for the lineage data is the QueryExecutionListeners, which is why the RDD transformations aren't getting picked up. Do you know of any way, or think it is possible, for Spline to track RDD transformations through some other means?
from spline-spark-agent.
End of the year - I'd say very likely.
Or, of course, if anybody from the community could help us with that, it could be released earlier.
Hi, can you please add a script reproducing the issue? We need to know how you read and write data; we are not interested in operations like filters, aggregations, etc.
Are you using RDDs or DataFrames? Spline can only track Spark transformations defined using the Spark SQL / DataFrame API. Unfortunately, RDD transformations are untrackable due to the lack of meta information.
Yes, we brainstormed this question a while ago, and a few partial solutions popped up. But they were either fragile or cumbersome to use, and none of them solved the problem completely.
We could capture a physical lineage, but the goal is to capture a logical one, so we need to stay as close to the user's source code as possible. We thought we could use AspectJ to intercept the RDD method calls and capture the logical plan that way. But again, most of the RDD methods accept lambdas, and on the JVM we don't know what is inside a lambda. This is the biggest issue. Perhaps the best solution would be one that approaches the problem from both sides - utilizing static source code analysis, and then linking the static metadata with the dynamic metadata collected at runtime. IMO that sounds doable in theory, but quite complex in practice. But we never say never :)
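To make the lambda-opacity point concrete, here is a toy illustration in plain Python (not Spline code, and not how the agent works internally): a framework that receives a user function can record that a transformation happened, but not what it computes.

```python
# Toy illustration (plain Python, not Spline code) of why RDD-style lambdas
# are opaque to a lineage tracker: the framework receives a callable and can
# record *that* a transformation happened, but not *what* it computes.
transform = lambda row: {**row, "amount": row["amount"] * 1.1}

print(callable(transform))   # True  -- the framework sees only an opaque function
print(transform.__name__)    # <lambda> -- no meaningful name, no semantics
# Recovering the "* 1.1" logic would require static source analysis,
# which is the hybrid (static + runtime) approach mentioned above.
```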
In the event that Spline can't capture this lineage information automatically, we were considering options to augment the lineage data more or less manually. Ideally Spline has full visibility into all actions and can report them automatically, but where the operations are opaque, or simply not captured in sufficient detail in Spark's plans, business requirements dictate that we still need to capture the actions somehow.
Do you think there would be any possibility in Spline's architecture to essentially inject user-defined lineage events into Spline? For context, we are calling DataFrame.foreachPartition and executing JDBC calls for each row here, so we would like to note that in the lineage stream, and could call a Spline method when we execute a DataFrame or RDD operation which Spline does not support.
I can see us needing some kind of sidecar in any case to have complete visibility over lineage information, so if it can be part of Spline itself, that would at least contain all the lineage information in a single graph. If that doesn't make sense architecturally, though, we can simply have two sets of lineage data and stitch them together outside of Spline.
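For reference, the pattern described above looks roughly like the following sketch (illustrative names only; no real Spark session or JDBC driver, with a fake connection standing in for the real one). Spark's plan records only a generic action for the foreachPartition closure, so the JDBC target is invisible to lineage tracking.

```python
# Sketch of the opaque pattern: per-row JDBC-style writes inside a
# foreachPartition closure. All names here are illustrative.
class FakeConnection:
    """Stand-in for a JDBC connection; records executed statements."""
    def __init__(self):
        self.executed = []

    def execute(self, stmt, params):
        self.executed.append((stmt, params))

    def close(self):
        pass

def write_partition(rows, connect):
    """What the foreachPartition closure does: one connection per
    partition, one INSERT per row. Invisible to plan-based lineage."""
    conn = connect()
    try:
        for row in rows:
            conn.execute("INSERT INTO target_table VALUES (?, ?)", row)
    finally:
        conn.close()
    return conn

# In real code this would be: df.rdd.foreachPartition(lambda rows: ...)
conn = write_partition([(1, "a"), (2, "b")], FakeConnection)
```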
Yes, it makes perfect sense. Thanks for the great idea. Need to think about it. So, basically, the way it could be done is something like this:
val df0: DataFrame = ... // my initial DataFrame
val rdd0 = df0.rdd // switch to an RDD level. Automatic lineage is lost at this point!
val rdd1 = rdd0.map(myCoolRDDTransformFunction) // do some stuff on RDD
val df1 = rdd1.toDF() // get back to DataFrame.
// From the business perspective df1 is derived from df0 via myCoolRDDTransformFunction, so...
df1.derivesLineageFrom(df0).as(spline.CustomOperation("my-cool-rdd-transform", ... /* other relevant info */ ))
I wonder if this approach could also work for tracking ML lineages.
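In spirit, such a derivesLineageFrom call amounts to recording an explicit, user-declared edge between two datasets whose link Spark cannot see. A minimal Python model of that idea (all names are hypothetical; no such Spline API exists yet):

```python
from dataclasses import dataclass, field

# Hypothetical model (not an existing Spline API): user-declared lineage
# edges between datasets whose connection Spark's plans cannot express.
@dataclass
class CustomOperation:
    name: str
    params: dict = field(default_factory=dict)

@dataclass(frozen=True)
class Dataset:
    id: str

class LineageGraph:
    def __init__(self):
        self.edges = []  # list of (source, target, operation) triples

    def derives(self, target, source, op):
        """Declare that `target` was derived from `source` via `op`."""
        self.edges.append((source, target, op))

# Mirrors the sketch above: df1 derives from df0 via the RDD transform.
graph = LineageGraph()
df0, df1 = Dataset("df0"), Dataset("df1")
graph.derives(df1, df0, CustomOperation("my-cool-rdd-transform"))
```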
Hi Wajda,
we are also facing the issue with RDDs; actually, the idea above is good. Can we expect it in the next release, please? We want to use Spline in our business model.
Hi Wajda,
We are experiencing an issue with generating lineage when there is RDD logic in the code. Can we expect a fix for it in the next version of Spline?
We'll see. Not in 0.5, but maybe in one of the nearest future releases.
Hi Wajda,
Can you please let us know if RDD lineage support is planned for release in the near term? We are integrating Spline into our business model and we have RDDs in our code.
Thanks,
Nikhith
We'll try to include it in 0.6, aiming for the release at the beginning of June.
@wajda Do you still feel that Spline is on track to support user-defined RDD lineage, or any kind of RDD lineage, within the 0.6 release or before the end of the year?
@ashyamala, it's still on the road map, the demand for this feature is growing. But apparently it won't fit into 0.6, as we've got plenty of stuff to do in 0.6 already.
It didn't make it to 0.6. It doesn't seem to be a priority so far...
Hello, I'm just wondering what the state of this issue is? Is it actively in progress?
No, it's not a priority at the moment, but is still on the road map.