Sponsor a Data Challenge

Sponsor a data challenge to crowdsource solutions for your business problems

Hosting a data challenge is a fun way to engage data scientists and machine learning experts around the world to solve your real business problems. Incentivize participants to build products using your data, explain insights in your data, or provide useful data predictions for your business. Plus, it's a whole lot cheaper than hiring an in-house data science team!

How to sponsor an Ocean Protocol data challenge?

  1. Establish the business problem you want to solve. The first step in building a data solution is understanding the problem you are trying to solve. For example, you may want to predict drought risk in an area to help price parametric insurance, or predict the price of ETH to optimize Uniswap LPing.

  2. Curate the dataset(s) that participants will use for the challenge. The key to hosting a good data challenge is providing an exciting and thorough dataset that participants can use to build their solutions. Do your research to understand what data is available: whether it is free from an API, available for download, requires any transformations, etc. For the first challenge, it is alright if the created dataset is a static file. However, it is best to ensure there is a path to making the data available from a dynamic endpoint so that entries can eventually be applied to current, real-world use cases. A sketch of publishing such a dataset with ocean.py follows this list.

  3. Decide how the judging process will occur. This includes how long the review period will be, how submissions will be scored, and how any prizes will be divided among participants (a simple scoring sketch also follows this list).

  4. Work with Ocean Protocol to gather participants for your data challenge. Creating blog posts and hosting Twitter Spaces are good ways to spread the word.
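
If you choose to publish the curated dataset on Ocean so that participants can consume it directly (step 2), the flow mirrors the ocean.py publish flow covered elsewhere in the Data Scientists section. Below is a minimal sketch assuming the ocean.py remote setup; the network, wallet environment variable, asset name, and dataset URL are placeholders.

```python
# Illustrative sketch only: publish a challenge dataset as a URL asset with ocean.py.
# The network name, environment variable, and URL are placeholders; see the
# Ocean.py "Remote Setup" and "Publish Flow" pages for the full steps.
import os

from brownie.network import accounts
from ocean_lib.example_config import get_config_dict
from ocean_lib.ocean.ocean import Ocean
from ocean_lib.web3_internal.utils import connect_to_network

connect_to_network("polygon-test")  # Mumbai testnet; use "polygon" for mainnet
ocean = Ocean(get_config_dict("polygon-test"))

accounts.clear()
sponsor = accounts.add(os.getenv("SPONSOR_PRIVATE_KEY"))  # hypothetical env var

# Publish the curated dataset from a public URL (a static file is fine to start)
name = "Drought-risk challenge dataset"         # placeholder title
url = "https://example.com/challenge-data.csv"  # placeholder URL
data_nft, datatoken, ddo = ocean.assets.create_url_asset(name, url, {"from": sponsor})

# Let participants access the data for free during the challenge
datatoken.create_dispenser({"from": sponsor})
print(f"Published challenge dataset with DID: {ddo.did}")
```

For step 3, scoring usually happens off-chain against a hold-out set that participants never see. The snippet below is a hypothetical illustration of ranking submissions by prediction error; a real challenge defines its own metric and submission format.

```python
# Hypothetical judging sketch: rank submissions by RMSE against a private hold-out set.
# File names, submission format, and the metric itself are up to the sponsor.
import csv
import math

def load_values(path):
    # Expects a one-column CSV of numeric values
    with open(path) as f:
        return [float(row[0]) for row in csv.reader(f)]

def rmse(y_true, y_pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

ground_truth = load_values("holdout_truth.csv")         # kept private by the sponsor
submissions = {"alice": "alice.csv", "bob": "bob.csv"}  # participant -> submission file

scores = {who: rmse(ground_truth, load_values(path)) for who, path in submissions.items()}
for rank, (who, score) in enumerate(sorted(scores.items(), key=lambda kv: kv[1]), start=1):
    print(f"{rank}. {who}  RMSE={score:.4f}")           # lower RMSE ranks higher
```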

To submit your application, kindly visit here, and for more information, head over to this page.

Make the game, set the rules.