Show HN: Blockchain Spark – Decode smart contract usage into DataFrame


Blockchain Spark lets you decode smart contract usage into DataFrames.

Example notebook using the decoded blockchain data: OpenSea Trading Metrics.

Using this Repo

Building

We use Maven to build the Java UDFs used by blockchain-spark.

To compile, run tests, and build jars:

Install blockchain-spark:

Running

To run a PySpark shell with blockchain-spark and its dependencies on the classpath:

pyspark --jars target/blockchain-spark-$VERSION-SNAPSHOT-jar-with-dependencies.jar

Running Tests

To run Python tests:

export SPARK_HOME=<location of local Spark installation>
python3 setup.py nosetests

Usage

Prerequisites

Before you start parsing events, you need to store logs and traces somewhere Apache Spark can access them
(usually an object store).

We suggest using Ethereum ETL to export them.

Quickstart

Context initialization

from spark3 import Spark3  # import path assumed from the class name

spark3 = Spark3(spark.sparkSession)

Get the decoded function calls and events of a given contract as DataFrames

contract = spark3.contract(address=..., abi=...)
function = contract.get_function_by_name("function_name")
event = contract.get_event_by_name("event_name")
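To illustrate what these helpers resolve against, here is a toy sketch (not the library's implementation) of looking up a function and an event entry in a standard JSON contract ABI. The ABI fragment and the `get_abi_entry` helper are invented for illustration:

```python
import json

# A minimal, invented ABI fragment in the standard JSON ABI format.
ABI = json.loads("""
[
  {"type": "function", "name": "atomicMatch_", "inputs": []},
  {"type": "event", "name": "OrdersMatched",
   "inputs": [{"name": "price", "type": "uint256", "indexed": false}]}
]
""")

def get_abi_entry(abi, entry_type, name):
    """Return the ABI entry with the given type ("function" or "event") and name."""
    for entry in abi:
        if entry.get("type") == entry_type and entry.get("name") == name:
            return entry
    raise KeyError(f"no {entry_type} named {name!r} in ABI")

function_abi = get_abi_entry(ABI, "function", "atomicMatch_")
event_abi = get_abi_entry(ABI, "event", "OrdersMatched")
```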

Show the first 100 rows of function calls

contract.get_function_by_name('atomicMatch_').filter('dt="2022-01-01"').show(100)

Aggregate on events

df = contract.get_event_by_name('OrdersMatched')
df.select("event_parameter.inputs.*", "dt").createOrReplaceTempView("opensea_order_matched")
spark.sql("""
select dt, count(1), sum(price) from opensea_order_matched
where dt > "2022-01-01"
group by dt
order by dt
""").show()
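Spark is not needed to see what this query computes. Here is a minimal, self-contained sketch using sqlite3 on a few invented OrdersMatched rows (the dates and prices are made up):

```python
import sqlite3

# Invented sample of decoded OrdersMatched rows: (partition date, price).
rows = [
    ("2022-01-01", 10.0),   # excluded by the filter: dt is not strictly greater
    ("2022-01-02", 1.5),
    ("2022-01-02", 2.5),
    ("2022-01-03", 4.0),
]

con = sqlite3.connect(":memory:")
con.execute("create table opensea_order_matched (dt text, price real)")
con.executemany("insert into opensea_order_matched values (?, ?)", rows)

# Same query shape: daily trade count and traded volume after 2022-01-01.
result = con.execute("""
    select dt, count(1), sum(price) from opensea_order_matched
    where dt > '2022-01-01'
    group by dt
    order by dt
""").fetchall()
# result == [('2022-01-02', 2, 4.0), ('2022-01-03', 1, 4.0)]
```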

Written by Charlie Layers