
This is not the current EPA website. To navigate to the current EPA website, please go to www.epa.gov. This website is historical material reflecting the EPA website as it existed on January 19, 2021. This website is no longer updated and links to external websites and some internal pages may not work.


EcoTox TARGET Challenge

Technology Advancing Rapid Gene Expression-based Testing (TARGET)


Challenge

The winner of this challenge will develop a technology for measuring global gene expression at a cost and scale that allows analysis of thousands to tens of thousands of samples per year. Target price point: $50 per sample or less.

This challenge calls for respondents to develop high-quality, low-cost technologies/platforms for evaluating global gene expression in samples from four common aquatic toxicity test organisms: Pimephales promelas (a fish), Daphnia magna (a crustacean), Chironomus dilutus (an insect; formerly Chironomus tentans), and Raphidocelis subcapitata (a green alga). These represent the species and associated trophic levels most frequently tested when evaluating the ecological hazards of chemicals.

Challenge Partners: 


Prize

  • Prize: $300,000 US
  • Prize Breakdown: There will be a single award of $300,000 made to the winning entry.


Important dates:

  • Registration Open/Close*: January 16, 2020 / March 18, 2020 (Registration is now closed.)
  • Challenge End Date (submissions due): June 14, 2021
  • Winners Announced:  Fall 2021

* You must be registered for the challenge by March 18, 2020 to participate.


Webinars


Rules

  1. Eligibility:
  • To be eligible to compete for the award, prospective solvers must register at: EcoTox TARGET Challenge Registration, no later than March 18, 2020.

  • Eligible:  Individuals, or teams from private companies, academic institutions, non-governmental organizations, or independent research or technological institutes. The competition is open to both US and foreign citizens/organizations.
  • Not eligible:  U.S. or foreign government organizations.
  • Eligible:  Individuals or organizations that receive funding from U.S. or foreign government organizations. However, funds from U.S. or foreign government organizations must not be used to directly fund the development of a solution requested under this Challenge.
  • Not eligible:  Individuals involved in development of award selection criteria or reference sample generation.
  2. Solvers must register by March 18, 2020 in order to take part in the competition.
  3. Required Information:
  • Technical point of contact for the application (name, position, title, affiliation, contact phone number, and contact e-mail address). This is the individual who will manage communications and coordination between the challenge sponsors (e.g., US EPA and partners) and the solver(s).
  • Listing of team members (including affiliations) and partner organizations (as appropriate).
  4. Solvers must agree to the terms and conditions outlined below. Registration for the Challenge indicates the agreement of the Solver, and all team members, with the terms and conditions.
  5. Participating solvers must identify a financial point of contact by April 15, 2020.
    • Name, full address, and contact information for the organization or individual that will receive the award money and manage distribution of the funds if your solution is selected as the winner.
    • Note:  If you receive the award, a single lump sum will be provided to the single point of contact identified above. That point of contact will be responsible for any further disbursement or distribution of the award money to the Solver’s institution and/or team members. Solvers are encouraged to establish and agree to an award distribution plan, in writing, prior to identifying the financial point of contact.
    • Neither U.S. EPA nor the U.S. Government is responsible for disbursement or distribution of the award money to entities other than the financial point of contact explicitly identified by the Solver with the winning solution.
  6. Solvers may not share method development or reference samples provided by the Challenge Sponsors with individuals or organizations not listed as team members by the registered Solver (registration may be updated to add team members after the Challenge has started).
  7. In order to be eligible for the Award, Solvers must submit a technology description template (see helpful resources) and a processed data file for the reference samples provided by the Sponsors.
    • Processed data should be provided as a single table with a column for each test sample and a row for each gene or measurement feature (e.g., probe), with values representing a measure of absolute or relative expression of each gene/feature in each sample, suitable for differential expression and signal-to-noise analysis. Solvers must also provide a clear, unambiguous mapping from columns to the provided sample IDs/method variants, and from rows to gene IDs in the transcriptome annotation for each species. The table should be provided as an Excel document, tab-delimited text, or CSV file.
    • Solvers should also document in detail the steps taken to generate the final processed data table from the raw data, and note any software or internal data that is proprietary (i.e., that the solver would not include when publishing results from this platform; see Technology Description below).
    • Solvers should be willing to furnish raw data upon request by the Sponsors.
  8. Judging and Award Selection: A panel of subject matter experts selected by EPA and the Challenge partners will judge submissions based on pre-defined scoring criteria. EPA will make final determinations and select the award winner.
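The required processed-data layout (a row per gene/feature, a column per sample) can be sketched with pandas; the gene and sample IDs below are placeholders for illustration, not actual Challenge identifiers:

```python
import pandas as pd

# Hypothetical expression values: one row per gene/feature, one column per
# test sample. A real submission would map columns to the Sponsor-provided
# sample IDs and rows to gene IDs in each species' transcriptome annotation.
data = {
    "sample_01": [120.5, 3.2, 45.0],
    "sample_02": [118.9, 4.1, 47.3],
    "sample_03": [130.2, 2.8, 44.1],
}
genes = ["gene_0001", "gene_0002", "gene_0003"]

table = pd.DataFrame(data, index=genes)
table.index.name = "gene_id"

# Any of the three accepted formats: Excel, tab-delimited text, or CSV.
table.to_csv("processed_expression.csv")            # CSV
table.to_csv("processed_expression.tsv", sep="\t")  # tab-delimited text
```

The same frame could also be written with `table.to_excel(...)` if the Excel format is preferred.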

Winning Entries:

The winning entry will provide a platform for measuring global gene expression at a cost and scale that allows analysis of thousands to tens of thousands of samples per year. Target price point: $50 per sample or less.

Respondents will be asked to evaluate global gene expression in samples from four common aquatic toxicity test organisms (provided by the US EPA and Partners): Pimephales promelas (a fish), Daphnia magna (a crustacean), Chironomus dilutus (an insect; formerly Chironomus tentans), and Raphidocelis subcapitata (a green alga). These represent the species and associated trophic levels most frequently tested when evaluating the ecological hazards of chemicals.

The technology must be capable of the following:

  • Precisely quantifying the expression of individual genes as reflected by relative mRNA abundance.
  • Providing complete or near-complete coverage of the expressed genome. This may be achieved through direct measurement of all expressed genes, or using validated sentinel gene sets whose expression can be confidently used to infer the response of the rest of the genome.
  • Facilitating gene-specific annotation suitable for linking target expression with target functions, and readily accommodating updates as annotations evolve and improve over time.
  • Performing analyses with the limited sample masses necessary for compatibility with high throughput testing protocols using small organisms or cells (e.g., <0.5 µg total RNA per sample).
  • Generating data in formats compatible with a standardized and automated quality assurance and data analysis workflow that can be used for dose-response modeling and differentially expressed gene/pathway analysis.
  • Meeting a level of quality, performance, and transcriptome coverage that is also economically and commercially viable for high throughput screening applications.

Also see EcoTox TARGET Challenge Rules.


Judging Criteria:

  1. Quality and performance of the platform and associated data (40%)
  2. Economic and commercial viability of the platforms (30%)
  3. Transcriptome coverage (30%)

Scoring Overview

Scoring will be based on the weighted (% of total score) criteria provided below. Each criterion is scored on either a nominal or fractional scale of 0 to 5 (0 lowest, 5 highest) or on a pass/fail basis (5 = pass, 0 = fail).

1. Quality and performance
  • Quality control:  Does the platform contain a quality control system that addresses consistency within and between samples and is consistent with current standards used within various platforms for transcriptomic analyses?
    • E.g., Microarray chip-based platforms should contain hybridization controls, redundant positional controls to evaluate edge effects, etc.
    • E.g., RNA-seq platforms should report the number of reads per sample, base quality scores by cycle, nucleotide distribution by cycle, GC content, etc.
    • Note: scoring for this category will require subjective judgment from the judging panel.

Points: 0 to 5 | Weighting: 10%

  • Data collection/extraction: Are the data collection/extraction methods and expression normalization/quantification methods described in adequate detail? Are they compatible with the ToxCast high throughput transcriptomics data analysis pipeline?

Points: 0 to 5 | Weighting: 5%

  • Precision: Precision will be determined by evaluating 1) coefficients of variation of gene expression values across unblinded technical duplicates, 2) correlation analysis of fold-change profiles between selected reference samples, resulting in a metric of concordance, and 3) clustering of unblinded reference samples when analyzed together with all conditions. Results from these three analyses will be normalized and merged into a multiplier between 0 and 1 that will be used to determine the total score between 0 and 5.

Points: multiplier × 5 | Weighting: 10%

  • Accuracy: Accuracy will be determined by evaluating the percent concordance between fold-change values determined from Solver data and the fold-change values determined during pre-qualification. Results will be used to determine the total score between 0 and 5.

Points: 0 to 5 | Weighting: 10%

  • Quantity of RNA: Was the quantity of reference RNA used per analysis tracked and reported (Y = points awarded; N=0 points)?
    • Results generated using < 1 µg total RNA = 2 pts
    • Results generated using < 0.25 µg total RNA = 1 pt
    • Results generated using < 0.1 µg total RNA = 1 pt
    • Results generated using < 0.01 µg total RNA = 1 pt

Points: 0 to 5 | Weighting: 5%
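Because the RNA-mass thresholds above nest inside one another, the point values read as cumulative: a result generated with less than 0.01 µg clears all four thresholds for the full 5 points. A small sketch of that reading, with the thresholds taken from the list above:

```python
def rna_quantity_points(total_rna_ug: float) -> int:
    """Cumulative points for the quantity of reference RNA used per analysis.

    Interprets the Challenge's nested thresholds as additive: each threshold
    cleared adds its points, up to the 5-point maximum. This cumulative
    reading is an assumption drawn from the "(0 to 5)" range in the table.
    """
    points = 0
    if total_rna_ug < 1.0:    # < 1 µg total RNA = 2 pts
        points += 2
    if total_rna_ug < 0.25:   # < 0.25 µg = +1 pt
        points += 1
    if total_rna_ug < 0.1:    # < 0.1 µg = +1 pt
        points += 1
    if total_rna_ug < 0.01:   # < 0.01 µg = +1 pt
        points += 1
    return points

print(rna_quantity_points(0.5))    # clears only the 1 µg threshold -> 2
print(rna_quantity_points(0.005))  # clears all four thresholds -> 5
```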

2. Economic and commercial viability
  • Economic viability (i.e., cost per sample, including downstream data analysis cost)
    • Are the sample preparation cost and the per-sample supply and reagent costs for conducting the sample analysis and generating the data provided? If proprietary downstream data analysis software is required, a per-sample adjustment based on the software license cost should be included in the overall sample cost.
    • Total per-sample cost:
      1. $20 or less = 5 pts
      2. >$20-$30 = 4 pts.
      3. >$30-$50 = 3 pts.
      4. >$50-$75 = 2 pts.
      5. >$75-$100 = 1 pt.
      6. >$100 = 0 pts.

Points: 0 to 5 | Weighting: 20%

  • Commercial viability and throughput capability
    • Is there a reasonable demonstration/description of how and when the Solver would be able to meet the potential throughput requirements of high-throughput (HTP) sample generation?

Points: 0 to 5 | Weighting: 10%
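The per-sample cost tiers above form a simple step function from cost to points. A sketch of that mapping, with tier boundaries copied from the list:

```python
def cost_points(total_per_sample_cost: float) -> int:
    """Map total per-sample cost (USD) to economic-viability points (0-5).

    Tier boundaries are taken directly from the Challenge's scoring list:
    $20 or less = 5, >$20-$30 = 4, >$30-$50 = 3, >$50-$75 = 2,
    >$75-$100 = 1, >$100 = 0.
    """
    tiers = [(20, 5), (30, 4), (50, 3), (75, 2), (100, 1)]
    for ceiling, points in tiers:
        if total_per_sample_cost <= ceiling:
            return points
    return 0  # more than $100 per sample

print(cost_points(18.0))   # $20 or less -> 5
print(cost_points(45.0))   # >$30-$50 -> 3
print(cost_points(150.0))  # >$100 -> 0
```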

3. Coverage
  • Approach and annotation
    • Is the approach taken for detection and quantification of transcript expression adequately described? (e.g., whether the platform employs a targeted or non-targeted analysis and the general means by which the platform detects and quantifies transcript presence and abundance)
    • Are annotation files provided with each platform/species that contain the required information and link to the data files?

Points: 0 to 5 | Weighting: 10%

  • Transcriptome coverage
    • What proportion of transcriptome coverage do the platforms have in relation to the pre-qualification standards? The mean percent coverage will be calculated across the four species’ platforms and used as a multiplier to determine point value.

Points: multiplier × 5 | Weighting: 20%

  • Species coverage
    • Did the solvers provide a platform and reference sample data for all four species?

Points: Y or N | Weighting: 100%*

*Eligible submissions will include platforms, data and associated required information for all four species.
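Putting the criteria together: each sub-criterion score (0 to 5) is multiplied by its weighting, so a perfect submission scores 5.0 overall; the species-coverage check is a separate pass/fail eligibility gate rather than a weighted term. A sketch of that aggregation, using the weightings from the tables above (the sub-criterion scores here are made-up values for illustration):

```python
# (score, weighting) pairs; weightings come from the scoring tables,
# scores are hypothetical 0-5 values for a sample submission.
criteria = {
    "quality_control":        (4.0, 0.10),
    "data_collection":        (5.0, 0.05),
    "precision":              (3.5, 0.10),
    "accuracy":               (4.0, 0.10),
    "rna_quantity":           (3.0, 0.05),
    "economic_viability":     (5.0, 0.20),
    "commercial_viability":   (4.0, 0.10),
    "approach_annotation":    (4.5, 0.10),
    "transcriptome_coverage": (4.0, 0.20),
}

# The weightings should account for 100% of the total score.
assert abs(sum(w for _, w in criteria.values()) - 1.0) < 1e-9

total = sum(score * weight for score, weight in criteria.values())
print(round(total, 3))  # -> 4.2 for these example scores
```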


Terms and Conditions

  • Solvers will not receive compensation for resources or time invested in addressing the challenge. Only the single top-ranked solution will receive a cash award.
  • Solvers retain their rights to all intellectual property (e.g., details and design of their technology) that may be disclosed to the sponsors over the course of the challenge. Technical details and designs will not be disclosed or published without permission from the technical point of contact named in the registration.
  • Sponsors retain the right to disclose reference sample data, performance criteria, and other evaluation criteria summarized in the technology description template to provide a transparent reporting of how the winning solution was selected.
  • Sponsors retain the right to publish, present, and/or otherwise publicize results of the challenge competition that do not involve disclosure of the Solver’s intellectual property. Solvers will be afforded the opportunity to review publications, presentations, or other publicity in order to protect against unwanted disclosure of intellectual property.
  • Solvers reserve the right to remove themselves from the competition at any time, up to final submission of results for evaluation. The technical point of contact must make the request in writing on behalf of the team.
  • Registration for the challenge does not confer any obligation to deliver results. However, any solvers removing themselves from the competition prior to evaluation forfeit the right to publish results obtained for the reference samples supplied for the competition without written consent from the challenge sponsors.
  • Solvers that do not submit their results and technology description template by the submission deadline will be automatically removed from the competition and subject to the same terms as if they had forfeited in writing. The submission deadline may be extended at the discretion of the sponsors, but any extension will apply to all registered solvers.

Non-Endorsement:

EPA and EPA officials do not endorse any product, service, or enterprise that may appear in submissions. Furthermore, by recognizing winning submissions, EPA is not endorsing products, services, or enterprises that may appear in them.

Funding Restrictions:

  • Challenge-solvers cannot use funding from the federal government (either through grants or contracts) to compete in the Challenge.
  • All prize awards are subject to EPA verification of the winners’ identity, eligibility, and participation in the Challenge. Awards will be paid using electronic funds transfer and may be subject to federal income taxes. EPA will comply with Internal Revenue Service (IRS) withholding and reporting requirements, where applicable.

Plagiarism:

EPA has a no-tolerance policy for plagiarism. Any applicant whose winning work is determined to be plagiarized in whole or in part will forfeit any awards.

Assistance Waiver:

By entering this Challenge, participant agrees to assume any and all risks and waive claims against the federal government and its related entities (except in the case of willful misconduct), for any injury, death, damage, or loss of property, revenue, or profits, whether direct, indirect, or consequential, arising from their participation in the Challenge, whether the injury, death, damage, or loss arises through negligence or otherwise.

Disclaimer:

EPA reserves the right to disqualify and/or clarify any submittal.


Helpful Resources


Contact

If you have questions about the EcoTox TARGET Challenge, email Dan Villeneuve.

To help raise awareness of the challenge, please use #EcoToxTARGETChallenge in your social posts.