Our mission is to revolutionize sports fandom by empowering fans with unique, verifiable digital collectibles – NFTs – featuring professional athletes. Player NFTs shake up the sports collectibles market by offering a modern alternative to the traditional baseball card.
We’re laying the groundwork. Fans and traders can track NFL and NBA player non-fungible tokens (NFTs) through our service, and we offer a platform for buying and showcasing them. Think of a platform akin to Ameritrade or Coinbase, but focused on sports-based NFT assets.
Most notably, we’ve also launched a program called The Homeowners Membership, which kicked off with a fantasy football league during the 2021–22 NFL season.
We awarded a total of $1.5 million in prizes to winners based on the performance of their NFL fantasy teams, which were composed of player NFTs they owned.
Collecting different types of NFTs has become its own form of connoisseurship, and we’re pushing the experience further by gamifying sports NFTs, giving them more functionality and more fun. That also creates more opportunities for astute traders to make money buying and selling player NFTs.
As CTO of a small startup, I found this endeavor exceptionally captivating: I had to spearhead the development of a data architecture that supported:
- Surges in data ingestion and concurrent usage during peak game-day periods of a fantasy sports league.
- Real-time player leaderboards reflecting actual game results.
- Player NFT data stored securely on the Ethereum blockchain, with safe and fast access to that information.
- Internal financial reporting, including budgeting and forecasting, tracking performance against plan, variance analysis, regulatory compliance and internal controls, and support for decision-making and resource allocation.
It’s a tall order. With the Web3 space evolving so rapidly, it’s hardly surprising that the first iteration of our data infrastructure struggled to keep up. Fortunately, we were able to adjust course quickly once we found a real-time analytics platform suited to our fast-changing needs.
DynamoDB: Analytics Limitations Revealed
In February 2021, I co-founded Personal, which until recently operated in stealth mode. To build a successful fantasy sports platform and NFT marketplace, we needed two primary data sources:
- Real-time game scores and player statistics from external data providers.
- Blockchain nodes, which act like servers that let multiple parties concurrently read and write data about NFTs and customer crypto wallets to a distributed ledger.
I designed and built our core data infrastructure around Amazon’s scalable, reliable DynamoDB cloud database. DynamoDB ingested and stored the external data in a single large table. Smaller DynamoDB tables held user data and the mechanics governing our fantasy sports contests. Beyond the traditional weekly best-team contests and overall player rankings, we also hosted contests for worst-performing teams, so users holding weaker NFTs still had a chance to win.
To run these contests, we needed to put those DynamoDB tables to work. The diversity of contests meant a wide variety of distinct queries, and that’s where DynamoDB’s analytical limitations became starkly apparent.
To keep DynamoDB queries fast, our first approach was to create a purpose-built secondary index with a key tailored to each specific query. But DynamoDB is a NoSQL database and does not support SQL operations such as joins across multiple tables the way relational databases do. To work around that, we duplicated comprehensive user data from our other DynamoDB tables into the primary one. This had significant drawbacks: keeping the duplicated data accurate and up to date over long stretches of gameplay was a challenge, and the redundant data consumed extra storage. It also demanded developers deeply versed in the intricacies of DynamoDB data analysis, and they’re a rare and expensive group.
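Because DynamoDB cannot join tables, that merging has to happen in application code. Here is a minimal sketch of that pattern (not our production code; the table layout, keys, and field names are hypothetical):

```python
# Illustrative application-side join: with no SQL joins in DynamoDB,
# results fetched from separate tables must be merged in the app layer.
# All names and fields below are hypothetical.

def merge_contest_entries(entries, users):
    """Join contest entries against user records fetched separately,
    the way an application must when the database cannot join."""
    users_by_id = {u["user_id"]: u for u in users}  # index one side
    merged = []
    for entry in entries:
        user = users_by_id.get(entry["user_id"])
        if user is None:
            continue  # skip entries with no matching user record
        merged.append({
            "user_id": entry["user_id"],
            "username": user["username"],
            "contest": entry["contest"],
            "points": entry["points"],
        })
    # sort into leaderboard order, highest score first
    return sorted(merged, key=lambda row: row["points"], reverse=True)

entries = [
    {"user_id": "u1", "contest": "week1", "points": 87.5},
    {"user_id": "u2", "contest": "week1", "points": 102.3},
]
users = [
    {"user_id": "u1", "username": "gridiron_gal"},
    {"user_id": "u2", "username": "nft_nate"},
]
leaderboard = merge_contest_entries(entries, users)
```

Every new query shape needs more code like this, plus a matching secondary index, which is exactly the maintenance burden described above.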
Thrown into the mix was an object-relational mapping (ORM) tool, Dynamoose, that we had already deployed. Dynamoose gives developers programmatic APIs and schema definitions for schema-less DynamoDB data, which adds flexibility and simplicity, but we had to accept a substantial increase in query latency in exchange for that modeling layer. As a consequence, trying to use DynamoDB for fast analytics became a daunting, seemingly endless ordeal. And with the NFL season less than a month away, we were in a difficult spot.
A Faster, Friendlier Solution
We considered a few alternative solutions. One was to build an additional data pipeline that would merge data as it was ingested into DynamoDB, but that meant building an entirely new workflow from scratch, likely adding a two-week development window. Another was to scrap DynamoDB altogether and switch to a traditional relational SQL database. Every option would have required considerable effort.
Once we found Rockset, we promptly got to work building a new customer-facing leaderboard entirely on it. One of the first things our team noticed was how easy Rockset was to use. Over a 12-year career I have worked closely with almost every major database system, and Rockset’s user interface is genuinely the most efficient one I’ve used.
The SQL query editor is a standout tool, tracking user history and saving queries for easy recall, among other features. It let my team of six developers, all with a strong grasp of SQL, become productive immediately. Just from reading the SELECT statements and joins, they understood what data they were working with and how to manipulate it. By the end of the day, they had built working SQL queries and APIs without any outside help. And because Rockset executes every query quickly and reliably, we didn’t need to build a custom index for each query the way we did with DynamoDB.
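To make the contrast concrete, here is a sketch of the kind of join-based leaderboard query that plain SQL makes trivial. Python’s built-in sqlite3 stands in for the SQL engine here, and the collection and column names are hypothetical; the point is the join-plus-aggregate shape, not any specific schema:

```python
# Stand-in demonstration of a join-based leaderboard query in SQL.
# sqlite3 is used purely as an illustrative engine; table and column
# names are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE player_stats (player_id TEXT, week INTEGER, points REAL);
    CREATE TABLE nft_holdings (user_id TEXT, player_id TEXT);
    INSERT INTO player_stats VALUES ('p1', 1, 24.0), ('p2', 1, 31.5);
    INSERT INTO nft_holdings VALUES ('u1', 'p1'), ('u1', 'p2'), ('u2', 'p2');
""")

# Total fantasy points per user, summed over the player NFTs they hold.
rows = conn.execute("""
    SELECT h.user_id, SUM(s.points) AS total_points
    FROM nft_holdings h
    JOIN player_stats s ON s.player_id = h.player_id
    WHERE s.week = 1
    GROUP BY h.user_id
    ORDER BY total_points DESC
""").fetchall()
```

One declarative statement replaces the custom index plus application-side merge that the same question required on DynamoDB.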
With Rockset, we stopped spending time working around DynamoDB’s analytical constraints, freeing up resources that manual data manipulation would otherwise have consumed.
Developer productivity is nice, but what about performance? That’s where Rockset truly excelled. Once we migrated all the queries powering our leaderboards to Rockset, we could query our data in 100 milliseconds or less. At a minimum, that’s a 30-fold speedup over DynamoDB.
Rockset also made scaling easy. Given how spiky our usage was, we needed to optimize for both performance and cost. During our first season, usage surged to roughly 20 times the off-peak level during prime-time games on Mondays and Thursdays and all day on Sundays. By simply resizing our Rockset instance during game windows, we kept the experience smooth without worrying about bottlenecks or timeouts.
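That schedule-driven resizing can be captured in a few lines. This is a hedged sketch: the instance size names and exact time windows below are hypothetical, not Rockset's actual sizing tiers or our real schedule, but they show how simple the peak/off-peak decision is:

```python
# Sketch of a schedule-driven sizing rule for game-day peaks.
# Size labels and windows are hypothetical placeholders.
from datetime import datetime

PEAK_SIZE = "LARGE"
OFF_PEAK_SIZE = "SMALL"

def desired_size(now: datetime) -> str:
    """Return the instance size to run at a given moment (local time)."""
    weekday = now.weekday()  # Monday == 0 ... Sunday == 6
    if weekday == 6:
        return PEAK_SIZE     # Sundays: NFL games all day
    if weekday in (0, 3) and 18 <= now.hour < 24:
        return PEAK_SIZE     # Monday/Thursday night prime time
    return OFF_PEAK_SIZE     # everything else is off-peak
```

A small cron job checking this rule and calling the platform’s resize API would be enough to track the 20x usage swings described above.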
With Rockset’s speed, scalability, and ease of use, we quickly moved the rest of our analytics onto its platform. We now have ten data collections in total, the largest holding 15 million records. Our other collections include:
- 65,000 NFT transactions from our inaugural year, worth a total of $1 million.
- Records for the 23,000 current customers in our system, along with the 160,000 unique NFTs they own.
- More than 400,000 blockchain records tied to NFT transactions on our smart contracts.
DynamoDB remains our database of record, powering our microservices, syncing with the blockchain, and ingesting real-time data feeds. Rockset handles all of the data retrieval and analytics on top of it: player NFT market data, pricing statistics, transaction records, and user-facing queries. Rockset syncs with DynamoDB in real time, refreshing game-score data every 5–10 seconds, and ingests NFT and user-wallet changes from the blockchain into a searchable collection.
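A sync pipeline like this typically has to unwrap DynamoDB's typed AttributeValue format into plain documents before they land in a searchable collection. The sketch below shows that flattening step; the attribute names are hypothetical, but the `{"S": ...}`/`{"N": ...}` wrapping is DynamoDB's standard wire format:

```python
# Flatten a DynamoDB item image into a plain document suitable for
# indexing in a searchable collection. Field names are hypothetical.

def flatten_attribute(value):
    """Unwrap one DynamoDB AttributeValue into a plain Python value."""
    if "S" in value:
        return value["S"]
    if "N" in value:
        return float(value["N"])     # DynamoDB numbers arrive as strings
    if "BOOL" in value:
        return value["BOOL"]
    if "M" in value:                 # nested map: recurse
        return {k: flatten_attribute(v) for k, v in value["M"].items()}
    raise ValueError(f"unhandled attribute type: {value}")

def flatten_item(item_image):
    return {k: flatten_attribute(v) for k, v in item_image.items()}

doc = flatten_item({
    "nft_id": {"S": "nft-123"},
    "last_sale_usd": {"N": "250.0"},
    "owner": {"M": {"wallet": {"S": "0xabc"}}},
})
```

A managed integration does this transformation for you on every change event, which is part of what made the 5–10 second refresh cadence hands-off for us.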
We also use Rockset to streamline our internal administrative reporting. Its joins let us merge market, user, and payments data from separate DynamoDB tables and export the results as CSV reports, which we can now generate in minutes.
Building the same reports on DynamoDB alone would have required error-prone scripts and manual steps; Rockset likely shaved days or even weeks off our development timeline. It also let us host giveaways and contests for customers who hold a complete set of NFTs on our platform or have made purchases totaling X dollars in the marketplace. Without Rockset, aggregating our rapidly growing collection of DynamoDB tables would have been a daunting amount of work.
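The join-then-export flow itself is simple once the data lives somewhere joinable. Here is a minimal in-memory sketch of it in plain Python, assuming three already-fetched datasets; the column names are invented for the example:

```python
# Sketch of the join-and-export-to-CSV reporting flow, using
# in-memory stand-ins for market, ownership, and payments data.
# Column names are hypothetical.
import csv
import io

market = {"nft-1": {"floor_usd": 40.0}, "nft-2": {"floor_usd": 15.0}}
owners = {"nft-1": {"user_id": "u1"}, "nft-2": {"user_id": "u2"}}
payments = {"u1": {"lifetime_spend_usd": 900.0},
            "u2": {"lifetime_spend_usd": 120.0}}

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["nft_id", "user_id", "floor_usd", "lifetime_spend_usd"])
for nft_id in sorted(market):
    user_id = owners[nft_id]["user_id"]
    writer.writerow([
        nft_id,
        user_id,
        market[nft_id]["floor_usd"],
        payments[user_id]["lifetime_spend_usd"],
    ])
report_csv = buf.getvalue()
```

In practice a single SQL query with two joins produces the same rows, and the CSV export is just the final formatting step.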
Future Plans
Last season, we distributed a total of $1.5 million in prizes; that was real money on the line. It was also, in effect, a proof of concept for our Rockset-based analytics platform, which performed flawlessly. Query errors and timed-out queries dropped to zero, queries run fast, and our average response time fell from a lengthy six seconds to a swift 300 milliseconds, regardless of dataset size.
Rockset also keeps my developers working at a high level of productivity, thanks to its intuitive user interface and native SQL support. Converged indexing and automatic query optimization mean engineers don’t have to spend valuable time tuning query performance.
As the NFL season approaches, our team is in talks with prominent figures from the worlds of sports media and fantasy sports. Because our platform uniquely pairs blockchain with a utility-based NFT solution, it attracts projects seeking exactly that integration.
We’re also actively working on a range of backend enhancements, including new APIs built on Rockset and other new capabilities. As we gear up for growth in every dimension, from our customer base to player NFTs to data insights, what won’t change is Rockset. It has proven it meets our diverse requirements: fast, flexible, sophisticated analytics that are easy to build and cost-efficient to operate.