DataBroker DAO is the first marketplace for buying and selling sensor data. As a decentralized marketplace for IoT sensor data built on blockchain technology, DataBroker DAO enables sensor owners to turn the data their sensors generate into an income stream. This opens up a wealth of opportunities across industries, putting existing data to wider and more effective use. The DTX token is the utility token of the DataBroker DAO platform: an ERC20-compatible token with 18 decimal places that serves as the currency for buying and selling sensor data on the platform.
On the DataBroker DAO platform, a data buyer spends DTX tokens to obtain data. Of each payment, the data provider receives 80%, the telecom (gateway) operator receives 10%, and the DataBroker DAO platform retains 10%. In this way the buyer easily obtains the data it needs, while the provider and the service operator are rewarded accordingly.
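The 80/10/10 split above can be sketched in a few lines. This is an illustration only: the function name and the integer-arithmetic convention are assumptions, but the percentages and the 18-decimal DTX denomination come from the article.

```python
# Sketch of the DTX revenue split described above (80/10/10).
# Amounts are in the token's smallest unit; DTX has 18 decimals,
# so 1 DTX = 10**18 base units.

DECIMALS = 18
PROVIDER_SHARE = 80   # percent to the data provider
GATEWAY_SHARE = 10    # percent to the telecom/gateway operator

def split_payment(amount_base_units: int) -> dict:
    """Split a DTX payment into provider/gateway/platform shares."""
    provider = amount_base_units * PROVIDER_SHARE // 100
    gateway = amount_base_units * GATEWAY_SHARE // 100
    # Give the platform the remainder so the shares always sum exactly.
    platform = amount_base_units - provider - gateway
    return {"provider": provider, "gateway": gateway, "platform": platform}

# Example: a purchase of 5 DTX
shares = split_payment(5 * 10**DECIMALS)
```

Using integer division and assigning the remainder to the platform keeps the three shares summing exactly to the original payment, which matters for token base units.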
At present, smart cities are one of DataBroker DAO's main focus areas, and the team is negotiating a cooperation with the Dubai government, which wants to improve residents' quality of life through smart-city initiatives. Through the DataBroker DAO platform, the Dubai government can easily obtain the data it needs. Another direction is smart farming: by collecting sensor data, farmers can increase productivity by 2% to 3% per year, and they can also sell that sensor data through DataBroker DAO for additional income.
Thanks to a strong development team and rich industry resources, DataBroker DAO has moved quickly to put the project into practice. More and more sensor manufacturers are choosing to join DataBroker DAO, and the ecosystem is developing rapidly. As a leading player in the niche of sensor data trading, DataBroker DAO has seized an industry opportunity, and with the rapid iteration of underlying blockchain technology it is well positioned for strong growth.
Token-curated registries for reputation, quality and content management
A core component of the platform is the registry of sensors and of the data streams/files made available on the platform. The DataStreamRegistry will store all data resources that provide streaming sensor data. Streamed data is typically real-time data from an IoT sensor and is sold by time span. A DataSetRegistry will hold "files" of data that can be purchased; these are sold as downloads.
To list a stream or set in such a registry, the owner must stake (i.e., send and lock) a certain amount of DTX tokens. These tokens are locked by data sellers as a guarantee of good faith.
There is a minimum stake required to remain listed in the registry. Data sellers can stake more DTX if they wish: the more tokens staked, the more prominently their streams/sets appear in listings (e.g., a higher sort position or extra badges in the interface). This increases the chance of a sale and at the same time gives buyers more assurance that the data is of high quality and contains the information advertised.
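The stake-weighted listing rule above can be sketched as a simple filter-and-sort. The class name, field names, and the minimum-stake value are illustrative assumptions, not details from the platform itself.

```python
# Hypothetical sketch of stake-weighted listing order, assuming each
# listing records the DTX amount its seller has locked.

from dataclasses import dataclass

MIN_STAKE = 100  # assumed minimum DTX stake required to stay listed

@dataclass
class Listing:
    name: str
    stake: int  # DTX locked by the seller

def visible_listings(listings):
    """Drop under-staked entries, then sort by stake (most prominent first)."""
    eligible = [l for l in listings if l.stake >= MIN_STAKE]
    return sorted(eligible, key=lambda l: l.stake, reverse=True)

catalog = [Listing("air-quality", 250), Listing("traffic", 90), Listing("soil", 400)]
ranked = visible_listings(catalog)
```

Here the under-staked "traffic" listing disappears entirely, while the most heavily staked listing sorts to the top, mirroring the prominence incentive the text describes.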
Buyers who are not satisfied with the quality of the data can challenge a record in the registry by staking some DTX tokens themselves. A challenge is shown to all potential buyers as a negative mark on the listing's reputation score in the UI; by itself, it does not block the sale of the data.
Once challenges reach a certain threshold, a DataBroker DAO administrator verifies the data provider. If the advertised data is found to be faulty, the provider's staked tokens are distributed equally among the challengers and the DataBroker DAO platform wallet, and the listing is removed from the registry. If the data is deemed reliable, the challengers' staked tokens are instead distributed to the data seller and the platform.
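The challenge-resolution rule can be sketched as follows. The even split among challengers plus the platform wallet follows the text; the function and field names, and the 50/50 split of forfeited challenger stakes between seller and platform (the article does not specify that ratio), are assumptions.

```python
# Minimal sketch of the challenge-resolution rule described above.

def resolve_challenges(seller_stake, challenger_stakes, data_is_faulty):
    """Return (payouts, listing_removed) after an admin review.

    seller_stake: DTX locked by the data seller
    challenger_stakes: list of DTX amounts locked by challengers
    """
    if data_is_faulty:
        # Seller's stake is split evenly among challengers + the platform wallet.
        parties = len(challenger_stakes) + 1
        share = seller_stake // parties
        payouts = {"challengers": [share] * len(challenger_stakes),
                   # platform also absorbs any integer-division remainder
                   "platform": seller_stake - share * len(challenger_stakes)}
        return payouts, True   # listing is removed from the registry
    # Data held up: challengers forfeit their stakes to seller + platform
    # (assumed 50/50 here; the article does not give the exact ratio).
    forfeited = sum(challenger_stakes)
    payouts = {"seller": forfeited // 2, "platform": forfeited - forfeited // 2}
    return payouts, False
```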
Identity Management for 1 Billion Sensor Owners
DataBroker DAO is a peer-to-peer marketplace for IoT sensor data. This data is generated by sensors, and we are talking about billions of sensors. These sensors, in turn, belong to a huge number of owners, who have contracted with network operators (telecommunication companies or manufacturers) to transmit the data their sensors generate to (mostly network) gateways for use.
Network operators act as gatekeepers for the data flowing through their gateways. They have already run the required KYC procedures on the sensor owner and have authenticated and verified the sensors themselves; they also protect their networks from unauthorized use. Moreover, in most areas network operators hold a monopoly position, so while the number of partner operators is still large, it is far smaller than the number of owners or sensors.
For DataBroker DAO, cooperating with these gateway operators is therefore a very attractive solution. By onboarding and authenticating gateway operators, the platform gains a way to manage and control a huge number of sensors by proxy.
This raises the question of managing the identities of sensors, owners and operators on the platform. Building on an end-user identity management solution such as uPort, the platform operates with "regulated identity proxy" contracts. These proxy contracts link to the sensor owner's wallet and addresses. Unlike pure end-user solutions, they are also tied to the owner's identity with the gateway operator and can be controlled by the gateway administrator.
This gives sensor owners the same full ownership, combined with the ability for gateway operators to control and automate their interactions with the system, and even to handle end users' private keys until proper key management systems are widely deployed and adopted. The system will be open-sourced prior to the public token sale.
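The proxy arrangement above can be modeled roughly as follows. This is a plain-Python illustration of the relationship, not the actual smart contracts: the class, field and method names are all invented for the sketch.

```python
# Illustrative data model for the "identity proxy" arrangement: each
# sensor owner gets a proxy record linking their own wallet to the
# gateway operator that completed KYC and may act on their behalf.

from dataclasses import dataclass, field

@dataclass
class IdentityProxy:
    owner_wallet: str        # the sensor owner's own address
    gateway_operator: str    # operator that vouches for (and can act for) the owner
    sensors: list = field(default_factory=list)

    def register_sensor(self, sensor_id: str, caller: str):
        """Owner *or* their gateway operator may register sensors by proxy."""
        if caller not in (self.owner_wallet, self.gateway_operator):
            raise PermissionError("caller may not act for this owner")
        self.sensors.append(sensor_id)

proxy = IdentityProxy("0xOwnerWallet", "0xGatewayOperator")
proxy.register_sensor("air-sensor-01", caller="0xGatewayOperator")
```

The key design point the article makes is captured in the permission check: the owner keeps full ownership, while the operator can automate interactions on the owner's behalf.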
dAPP and dAPI
In the blockchain field, most projects are built as distributed applications, or dAPPs. These client applications interact directly with Ethereum or other blockchains. In many cases they run against remote shared nodes for the sake of user experience, and while this is currently the only way to create user-friendly peer-to-peer applications, it has serious drawbacks for some of our use cases:
● Single point of failure. During some recent token sales, client applications combined with high demand brought these shared nodes down. Not for lack of effort or skill, but because of the sheer number of RPC calls required to perform specific functions on Ethereum smart contracts. In high-stakes settings, such failures are unacceptable.
● The web interface and applications look good, but the real value is in the API. In the current SaaS and cloud boom this is almost taken for granted: you don't have a real product unless you also have an API for it. Slack, Zapier, GitHub, CRM and ERP systems all owe part of their success to their dedication to APIs.
● More applications, more problems. Adding yet another interface only makes things harder for the average user. Sensor owners already have an account with their carrier; they have learned how to use it and are happy with it (if they weren't, they would have switched carriers).
That's why we added what we call the dAPI. Like a dAPP, it is an API application deployed on every node. The dAPI is aimed mainly at gateway operators, data processors and large data buyers, rather than at sensor owners or small-scale buyers, who will instead use the (off-the-shelf) connection provided by their network operator or the DataBroker DAO dAPP.
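To make the dAPI idea concrete, here is a purely hypothetical sketch of how a gateway operator's backend might talk to a dAPI node. The endpoint path, parameter names and response handling below are invented for illustration; the article does not specify the real dAPI interface.

```python
# Hypothetical dAPI client sketch: build an HTTP request that asks a
# dAPI node to purchase a span of a sensor stream. Endpoint and field
# names are assumptions, not the documented interface.

import json
from urllib import request

def build_purchase_request(dapi_node: str, stream_id: str,
                           hours: int, buyer: str) -> request.Request:
    """Build (but do not send) a stream-purchase request for a dAPI node."""
    payload = json.dumps({
        "stream": stream_id,
        "duration_hours": hours,
        "buyer": buyer,
    }).encode()
    return request.Request(
        f"{dapi_node}/v1/purchases",               # hypothetical endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_purchase_request("http://localhost:8545", "air-quality-01",
                             24, "0xBuyerWallet")
# An operator's backend would then send it, e.g. urllib.request.urlopen(req),
# and the dAPI node would relay the purchase on-chain.
```

Keeping the request-building separate from the network call reflects the article's point: the API, not the web interface, is what operators and large buyers integrate against.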
Data Distribution and Storage
Billions of sensors generate enormous amounts of data. Any company that uses IoT sensor data therefore already has its own systems for processing that data and is unlikely to want to replace them. This means we cannot force buyers to adopt a new data storage system; what's more, permanently storing all IoT sensor data is not the goal of the platform.
Connectors built into the dAPI integrate with leading IoT and big-data storage providers, allowing buyers to choose where their purchased data is delivered.
There is, however, one valid use case for the blockchain with respect to such data: its immutability and timestamping properties. To benefit from these, the dAPI will anchor the data on the Ethereum mainnet (using the Chainpoint spec).
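The Chainpoint approach means the raw readings never go on-chain: each reading is hashed, the hashes are combined into a Merkle tree, and only the root is anchored in an Ethereum transaction. The toy below builds such a root (the anchoring transaction itself, and Chainpoint's exact proof format, are omitted).

```python
# Sketch of Chainpoint-style anchoring: hash sensor readings, build a
# Merkle tree over the hashes, and anchor only the 32-byte root on-chain.

import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list) -> bytes:
    """Compute a Merkle root over the leaves (duplicating odd tails)."""
    level = [sha256(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:              # duplicate last node on odd levels
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

readings = [b"23.5C@2018-06-01T12:00Z",
            b"23.7C@2018-06-01T12:05Z",
            b"23.6C@2018-06-01T12:10Z"]
root = merkle_root(readings)  # 32-byte digest to anchor on Ethereum
```

Anyone holding a reading plus its Merkle path can later prove, against the anchored root, that the reading existed unmodified at anchoring time, which is exactly the immutability and timestamping benefit the text refers to.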
Related links:
https://www.qukuaiwang.com.cn/news/6693.html
http://www.120btc.com/coin/1814.html
https://cloud.tencent.com/developer/news/248571
*The above content was compiled by the YouToCoin official team. Please indicate the source when reprinting.