99 Problems But NFTs Ain't One
Ask most people what they know about NFTs, and it's likely you'll hear a lot about digital art, tokenized authenticity, proof of ownership, and/or a hot new asset class (or, depending on who you hang out with, commodity fetishism and the death knell of late-stage capitalism). But hot takes aside, today's market for generative artwork is only the tip of the iceberg.
Whatever the use case (and the possibilities are vast), today's NFT ecosystem is generating mountains of largely unstructured data, both on and off chain. And without established technical standards to ensure clear structure and verifiability, it's all too easy for that data to become the stuff of developer nightmares.
More possibilities, more problems.
As use cases for NFTs continue to expand beyond 1/1 works of art and 10k pfp collections, developers building applications that integrate on-chain tokens face a widening maze of challenges around data management and queries. The power of an NFT lies in its ability to represent any unique entity, with today's common use cases including in-game tokens, editioned generative artwork, event ticketing, and even physical works or goods. These tokens rely on smart contracts, which can be written in a variety of programming languages and use varying methods of metadata storage and retrieval. The digital asset to which a given piece of metadata refers, and the metadata itself, can each be located almost anywhere in the decentralized web, which is itself both immeasurably large and continuously expanding.
Combine a diversity of programming languages and an absence of standardization, and you end up with some novel problems, mostly involving how to handle the sheer volume of data generated, how to locate it in the many places it might be stored (both on chain and off), and how to handle inconsistent data formats and structures.
The Laconic Network was created to address these challenges through shared standards that make it possible for DApp developers to quickly and intuitively integrate and manage disparate NFT data and assets. In this piece, the first of a two-part series, we look at five major NFT implementation and integration issues, along with what we're doing to solve each one, so you can sleep at night.
1. NFT data lacks consistency and integrity.
The functionality of an NFT lies in the metadata describing the individual item it represents. That metadata can consist of traits describing the characteristics of a jpeg artwork, essential information about copyright and intellectual property, guidelines for the item's intended presentation, or all of the above and more. Metadata can also address a broad set of questions: Am I allowed to use an NFT for commercial purposes based on IP and copyright? Can I play a specific video, given its file format and codec? Are there readable methods for rendering this generative work? Can I easily list this NFT on the larger NFT marketplaces, where I have the best chance of selling it for the best price and in a timely manner?
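To make that concrete, here is a minimal sketch of the kind of metadata a single token might carry, loosely following the common ERC-721/OpenSea metadata conventions. The field names follow that convention; the values are purely hypothetical.

```typescript
// A minimal sketch of ERC-721-style token metadata. Field names follow the
// common name/description/image/attributes convention; the values are hypothetical.
interface TokenAttribute {
  trait_type: string;
  value: string | number;
}

interface TokenMetadata {
  name: string;
  description: string;
  image: string;               // often an ipfs:// or ar:// URI, sometimes an https:// URL
  external_url?: string;
  attributes?: TokenAttribute[];
}

const exampleMetadata: TokenMetadata = {
  name: "Example Token #1",
  description: "Hypothetical metadata for a single editioned work.",
  image: "ipfs://bafybeigdyrexamplecid/1.png",
  attributes: [
    { trait_type: "Palette", value: "Monochrome" },
    { trait_type: "Edition", value: 1 },
  ],
};
```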
The rapid growth of the NFT market has further expanded the possible functions of metadata. Developers need an efficient way to retrieve all types of metadata, and to maintain correlation with their on-chain counterparts in provable, hash-linked data structures. Laconic Watchers, APIs that serve data from the Laconic Network, fill this need. The Watchers' custom search and caching services collect variously constructed data and combine it into a unified form that DApps can interpret and use, without sacrificing data integrity.
2. NFT data is fragmented and scattered.
It's not practical to use current blockchain technology as a data storage or retrieval protocol. It was designed primarily as a means for achieving trustless consensus, not as a data availability system. The assets to which NFTs refer are often stored in protocols such as Arweave and IPFS, and pointed to by the NFT's "tokenURI." In one recent example, complete rendering libraries are stored as compressed, on-chain data URIs; smart contracts then access these libraries to render fully on-chain generative artworks.
A problem emerges, however, whenever a DApp needs to access any of this information. Methods for retrieval and DApp ingestion vary wildly depending on data type and location. And while RPC services can locate data, for a price, in most cases there's no verifiable way to ensure its integrity.
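To illustrate how much the retrieval path depends on where the data lives, here is a hedged sketch of resolving one token's metadata, assuming ethers.js v6 and a Node 18+ runtime; the RPC endpoint, contract address, and IPFS gateway are placeholders, not recommendations.

```typescript
import { ethers } from "ethers";

// Minimal ERC-721 ABI fragment: just the tokenURI getter.
const erc721Abi = ["function tokenURI(uint256 tokenId) view returns (string)"];

// Placeholder endpoint and contract address, for illustration only.
const provider = new ethers.JsonRpcProvider("https://rpc.example.com");
const nft = new ethers.Contract("0xYourNftContractAddress", erc721Abi, provider);

// Resolve a token's metadata, branching on where the data actually lives.
async function fetchMetadata(tokenId: bigint): Promise<unknown> {
  const uri: string = await nft.tokenURI(tokenId);

  if (uri.startsWith("data:application/json;base64,")) {
    // Fully on-chain metadata, embedded as a base64 data URI.
    const json = Buffer.from(uri.split(",")[1], "base64").toString("utf8");
    return JSON.parse(json);
  }
  if (uri.startsWith("ipfs://")) {
    // IPFS-hosted metadata, fetched through a public gateway.
    const path = uri.replace("ipfs://", "");
    return (await fetch(`https://ipfs.io/ipfs/${path}`)).json();
  }
  // Fall back to a plain HTTPS (or Arweave gateway) URL.
  return (await fetch(uri)).json();
}
```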
Despite being the most lightweight element of the Laconic Stack, the Laconic Watcher alleviates this proof problem by preserving proofs across data transformations while querying a far smaller subset of data than is typically required.
3. For both NFTs and equivalent token formats, data types vary from blockchain to blockchain.
Most of today's NFTs are based on Ethereum, with contracts most commonly written in Solidity. But zoom out for a wider view of the possibilities for both token types and blockchains, and the problem space increases correspondingly. On Tezos, for instance, smart contracts are most often written in SmartPy or LIGO, with the low-level, domain-specific Michelson in third place. On Solana, Rust, C, and C++ are commonly used to compose smart contracts (referred to in the Solana ecosystem as "programs"). It's safe to say that we have more than a small naming, language, and methodology mess on our hands in the blockchain ecosystem!
Let's imagine a DApp that tracks all NFTs representing a certain kind of media (books, for example) to provide an index of on-chain literature. It would need the ability to accurately interpret smart contracts from each chain, across widely varying programming languages, syntax, and metadata standards.
The Laconic Network drastically simplifies this process, offering developers a unified view of data while allowing DApps to agnostically query data from multiple blockchains through a single decentralized, content-addressable database.
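For illustration only, here is one hypothetical shape such a unified, chain-agnostic record might take. This is not the Laconic schema; it is just a sketch of what normalizing tokens from Ethereum, Tezos, and Solana into one view implies.

```typescript
// Hypothetical, illustrative shape for a chain-agnostic NFT record.
// This is NOT the actual Laconic schema; it only sketches what a single
// unified view over several chains and token standards would have to capture.
interface UnifiedNftRecord {
  chain: "ethereum" | "tezos" | "solana";
  contractOrProgram: string;           // 0x... address, KT1... address, or Solana program ID
  tokenId: string;
  standard: string;                    // e.g. "ERC-721", "FA2", or a Solana NFT standard
  metadataUri: string;                 // tokenURI or the chain-specific equivalent
  metadata: Record<string, unknown>;   // normalized metadata payload
  proof?: string;                      // hash linking the record back to on-chain state
}
```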
4. NFT games require lightning-fast retrieval and scalability.
The volume of NFT transactions is expected to rise significantly with the current influx of institutional Web 2 players into the Web 3 ecosystem. The problem: as transaction volume and speed increase, maintaining data availability with censorship resistance and proof of integrity becomes increasingly difficult to sustain. And blockchain-based games alone are poised to send NFT transaction volumes to stratospheric heights, with more and more in-game events and transactions driving mounting network traffic. For example, the popular game Gods Unchained is already generating significant transaction volume, and it's just one of countless on-chain games.
The growth of games and social apps using on-chain transactions continues to expose issues with Ethereum's scalability and speed. The problem is compounded by the fact that most indexing services are typically a few blocks behind with updates, too far behind for DApps to react to in-game events in time. And that leaves players holding the bag, subjected to unnecessarily clunky and unwieldy gaming experiences.
Unlike traditional blockchain indexing services, the Laconic Network is equipped to provide up-to-date blockchain data with trivial delay, for scenarios in which real-time data retrieval is essential to user experience.
5. Data retrieved from multiple locations is difficult to verify.
A fundamental promise of blockchain is verifiability. And while on-chain data is verifiable per se, NFT data can be stored in any number of data repositories. Meanwhile, using that data requires intermediaries such as DNS records, traditional web servers, and files stored in Web 2 datacenters. Every one of these exposes the data to the possibility of censorship, manipulation by bad-faith actors, or simple disappearance. In such scenarios, associated NFT data is most often tied to a token via a "tokenURI" that references the location of a JSON file containing token metadata. That file, in turn, lives in one of these off-chain data storage protocols.
Typical RPC providers can retrieve information about such data, then provide it to a client. A wallet client, for example, can find any media files associated with a particular NFT and display them. But this creates a trust bottleneck in the Web 3 ecosystem, requiring DApps and their users to trust the source of the data without proof. Closing this trust gap is one of the primary goals of Laconic.
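One common mitigation, sketched below on the assumption that an expected keccak256 digest of the payload is already known from on-chain data, is to hash whatever an RPC provider or gateway returns and compare it against that on-chain commitment. The function and parameter names here are illustrative.

```typescript
import { ethers } from "ethers";

// Illustrative integrity check: does the payload an RPC provider or gateway
// handed us match a digest we already trust (e.g. one recorded on chain)?
// The function and parameter names are hypothetical.
async function fetchAndVerify(url: string, expectedKeccak256: string): Promise<Uint8Array> {
  const response = await fetch(url);
  const bytes = new Uint8Array(await response.arrayBuffer());

  // keccak256 returns a 0x-prefixed lowercase hex string.
  const digest = ethers.keccak256(bytes);
  if (digest !== expectedKeccak256.toLowerCase()) {
    throw new Error(`Integrity check failed: got ${digest}, expected ${expectedKeccak256}`);
  }
  return bytes;
}
```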
Today's NFT ecosystem isn't providing standards. Enter Laconic.
It's clear that tokenization standards vary widely across blockchains. Even with similar token types on the same chain, we see countless examples of how and where data is stored: fully on-chain data URIs, decentralized protocols such as IPFS and Arweave, and ordinary centralized web servers. As the examples below suggest, two tokens exposing the same interface can keep their data in entirely different places.
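A few illustrative tokenURI values (all hypothetical) show how different the answers can be, even for tokens that implement the same interface:

```typescript
// Hypothetical tokenURI values for tokens exposing the same interface,
// each pointing at a very different storage strategy.
const exampleTokenUris = [
  "data:application/json;base64,eyJuYW1lIjoiIzEifQ==",    // fully on-chain data URI
  "ipfs://bafybeigdyrexamplecid/1.json",                   // IPFS
  "ar://ExampleArweaveTransactionId",                      // Arweave
  "https://api.example-project.io/metadata/1",             // centralized web server
];
```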
Stay tuned for Part 2, where we'll dive into more detail on how Laconic solves common implementation and integration challenges, offering both collectors and developers a far smoother NFT experience.
Got questions? Join us on Discord.