Dynamic Tactical Targeting: Tactical Exercises and System Testing (DTT:TEST)

The summary for the Dynamic Tactical Targeting: Tactical Exercises and System Testing (DTT:TEST) grant is detailed below. This summary states who is eligible for the grant, how much grant money will be awarded, current and past deadlines, Catalog of Federal Domestic Assistance (CFDA) numbers, and a sampling of similar government grants. Verify the accuracy of the data FederalGrants.com provides by visiting the webpage noted in the Link to Full Announcement section or by contacting the appropriate person listed as the Grant Announcement Contact. If any section is incomplete, please visit the website for the Defense Advanced Research Projects Agency, which is the U.S. government agency offering this grant.
Dynamic Tactical Targeting: Tactical Exercises and System Testing (DTT:TEST): DYNAMIC TACTICAL TARGETING: TACTICAL EXERCISES AND SYSTEM TEST (DTT:TEST), SOL BAA 05-04, DUE: 29 November 2004; POC: Dr. Robert R. Tenney, DARPA/IXO; FAX: (703) 741-0081.

The Defense Advanced Research Projects Agency's (DARPA) Information Exploitation Office (IXO) is soliciting proposals for the Dynamic Tactical Targeting: Tactical Exercises and System Test (DTT:TEST) program under this Broad Agency Announcement (BAA).

BACKGROUND: The Dynamic Tactical Targeting (DTT) program is currently developing technology to continuously cross-cue a large set of heterogeneous, partially controllable sensor platforms to maintain track on known targets, while maintaining search efforts to find new candidate targets, so that known targets can be held at risk until a commander authorizes engagement. Abstractly, the sensors can be viewed as a pool of resources, and mission needs (search an area, maintain track on a designated vehicle) as prioritized demands. DTT is the process that continuously and proactively assigns resources to demands as the situation changes: new missions are defined, known vehicles move, new vehicles are found, existing tracks degrade, etc.

The DTT program began in 1999 and is ending this year. It developed an initial set of software components, integrated into a complete, closed-loop system, that clearly validated the above premise, but only on simulated data. Simulated data simply cannot contain all of the artifacts and error sources present in the real world. The Dynamic Tactical Targeting: Tactical Exercises and System Testing (DTT:TEST) program has been created to validate the premise in live exercises, leveraging companion efforts in the Army (affiliated with the Future Combat System) and the Air Force (affiliated with the Distributed Common Ground Station).
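The resources-to-demands premise described above can be sketched in miniature. The following Python example is purely illustrative and not part of the solicitation; all class names, the greedy scoring rule, and the parameters are hypothetical. It serves demands in priority order, giving each the nearest free sensor that can cover it, the kind of assignment loop that would re-run continuously as the situation changes.

```python
# Illustrative sketch only -- not part of the solicitation.
# Sensors are a pool of resources; mission needs are prioritized demands.
from dataclasses import dataclass

@dataclass
class Demand:
    name: str
    priority: int          # higher = more important
    location: tuple        # (x, y) of the area or track of interest

@dataclass
class Sensor:
    name: str
    location: tuple
    range_km: float

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def assign(sensors, demands):
    """Greedy one-pass assignment: serve demands in priority order,
    giving each the nearest free sensor that can cover it."""
    free = list(sensors)
    plan = {}
    for d in sorted(demands, key=lambda d: -d.priority):
        in_range = [s for s in free
                    if dist(s.location, d.location) <= s.range_km]
        if in_range:
            best = min(in_range, key=lambda s: dist(s.location, d.location))
            plan[d.name] = best.name
            free.remove(best)
    return plan
```

In an operational setting this assignment would be recomputed whenever missions are added, tracks degrade, or platforms move; the actual DTT components perform far richer, kinematically constrained planning.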
PROGRAM OBJECTIVE: The DTT:TEST program will validate the premise that automated tools can be employed to: (1) align the transformed sensor data to a common geospatial reference system; (2) correlate data across sources into consistent target tracks; (3) predict future target motion; (4) create sensor-specific tasking that develops the most effective way to employ sensors in the context of anticipated target motion (for targets in track) and remaining search tasks (for targets yet to be discovered); and (5) allow a commander and her staff to maintain situation awareness and supervise the operation of the automation. The DTT:TEST program will build and integrate technologies for these five areas and exercise them on real-world sensor data, both recorded and live. Some of these technologies may be drawn from those funded by the original DTT program, but IXO is aware of, and interested in, other equally mature technologies that may outperform those employed by DTT to date. Validation will be accomplished in live exercises, leveraging companion efforts in the Army and Air Force.

PROGRAM STRUCTURE: The DTT:TEST program contains five coordinated, technology-intensive components developed in parallel, and two system integration and evaluation efforts. This BAA solicits proposals to develop these technology areas, using novel technologies to achieve the functional capabilities described in the Proposal Information Pamphlet (PIP), available at the URL provided below. The objective of DTT:TEST is to develop the highest and most robust level of automation possible for these components, recognizing that they will always be supervised by a commander and her staff. The seven program elements are:

(1) Registration: Use reported sensor/platform status reports, correlations between sensor data and fiducials, and correlations of sensor reports to friendly vehicle status reports to estimate parameters associated with systematic sources of error in sensor/platform reporting processes.
Continuously adjust current and historical sensor data to eliminate these errors. Modify reported confidence information to account properly for this adjustment.

(2) Prediction: Given an initial area in which a vehicle is hypothesized to exist, and some indication of the vehicle type, use terrain and environmental knowledge to compute the area, as a function of time, that the vehicle can reach. Ensure that the predicted area is valid for both benign and evasive targets.

(3) Fusion: Combine report- and track-level data into a consistent set of target tracks and accompanying state estimates, both current and historical. Properly account for reporting delays between sensors and the DTT system, and for processing artifacts (e.g., statistical correlation) introduced along that path. Allow the use of vehicle-specific identity information ("fingerprints") to correlate reports and tracks separated widely in time. Regularly and frequently update target state estimates to maintain a current situation estimate.

(4) Control: Given a set of missions, partial knowledge of targets expressed as target state estimates and predicted location envelopes, an ISR force consisting of fixed and mobile sensors, a set of operating constraints, and knowledge of terrain and weather, construct kinematically feasible platform routes and sensor tasking schedules to maximize mission accomplishment. After approval or modification by commanders, publish the routes and schedules to the affected systems. Continuously update the routes and schedules as target estimates change, as platforms and sensors execute their missions, and as mission commanders refine mission definitions.

(5) Command: Continuously update situation estimates derived by the DTT system on a physically separated, networked set of command workstations. In this context, elicit ISR mission needs from mission commanders through a shared, collaborative workspace.
Transform graphical and tabular input data into a formal machine representation capable of guiding sensor tasking. Display system-derived sensor routes and schedules for approval or modification. Track progress against each mission, and generate alerts when missions appear infeasible.

(6) System Design and Integration: Establish common data representations that support information exchange among the technical components; develop a DTT:TEST system architecture and supporting intra-component software interfaces; build and operate an unclassified, internet-accessible software integration facility; and design and conduct a rigorous system-level test regimen to verify correct functional behavior.

(7) Experiment Design and Evaluation: Define system- and component-level metrics; design instrumentation to obtain data from the field and from components to compute those metrics; maintain and operate a classified DTT:TEST exercise facility (in Building 620, Wright-Patterson Air Force Base); and establish secure, low-latency connectivity between that facility and the exercise partners.

PROGRAM PHASES: The DTT:TEST program will be conducted as a single, 24-month effort. For planning purposes, assume the program extends from 1 January 2005 through 31 December 2006. Within the 24-month period, the program will proceed in quarterly (3-month) development spirals. Each spiral will involve: (1) definition of system-level functionality upgrades; (2) updates to interface control documents; (3) definition of test coverage and schedule at the integration facility; (4) delivery of intermediate component software, nominally monthly, to support testing; and (5) delivery of a final version of the system to the classified exercise lab. Exercises will be conducted on an opportunistic basis, managed to conform to the schedules of the other activities sponsoring the exercises. Each exercise will be conducted using the then-current version of the system software in the classified lab.
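Among the program elements above, the Prediction task in element (2), computing the area a vehicle can reach as a function of time given terrain, admits a compact illustration. The following Python sketch is purely illustrative and not part of the solicitation; the grid representation and all parameters are hypothetical. It runs a shortest-time (Dijkstra) search over a terrain grid whose cells carry traversal times, returning every cell reachable within a time horizon.

```python
# Illustrative sketch only -- not part of the solicitation.
# Reachable-area prediction over a hypothetical terrain grid.
import heapq

def reachable(grid_cost, start, horizon):
    """grid_cost[r][c] = minutes to traverse cell (r, c); use
    float("inf") for impassable terrain. Returns the set of cells
    reachable from `start` within `horizon` minutes, moving in the
    four cardinal directions."""
    rows, cols = len(grid_cost), len(grid_cost[0])
    best = {start: 0.0}           # earliest arrival time per cell
    heap = [(0.0, start)]
    while heap:
        t, (r, c) = heapq.heappop(heap)
        if t > best.get((r, c), float("inf")):
            continue              # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nt = t + grid_cost[nr][nc]
                if nt <= horizon and nt < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = nt
                    heapq.heappush(heap, (nt, (nr, nc)))
    return set(best)
```

The real program element must also bound evasive movement and work on continuous terrain, but the core quantity, an area parameterized by time, is the same.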
In addition, the classified lab will contain a Government-supplied simulation capability to support functional testing and evaluation. At the end of each spiral, the Experiment Design and Evaluation effort will deliver quantitative assessments of component- and system-level performance.

SELECTION CRITERIA: Proposals will be selected through a technical/scientific/business decision process, with technical and scientific considerations being most important. Evaluations will be performed using the following criteria, listed in descending order of relative importance: (1) relevance to DTT:TEST mission objectives; (2) consistency with DTT:TEST program concepts; (3) technical innovation and depth; (4) personnel and corporate capabilities and experience; and (5) cost realism and value of the proposed work to the Government. Further details may be found in the PIP, which can be accessed at the URL provided in GENERAL INFORMATION below. Proposed research should investigate innovative approaches and techniques that lead to or enable revolutionary advances in the state of the art. Proposals are not limited to the specific strategies listed above, and alternative visions will be considered. However, proposals should be for research that substantially contributes toward the stated goals. Research should result in prototype hardware and/or software demonstrating integrated concepts and approaches. Specifically excluded is research that primarily results in evolutionary improvement to the existing state of practice or focuses on a specific system or solution. Integrated solution sets embodying significant technological advances are strongly encouraged over narrowly defined research endeavors. Proposals may involve other research groups or industrial cooperation and cost sharing. This BAA shall remain open, and proposals will be accepted, for up to one year following the BAA's release.

GENERAL INFORMATION: There WILL NOT be a Briefing to Industry (BTI) for this solicitation.
Proposal abstracts ARE NOT requested in advance of full proposals. DARPA will employ an electronic upload process for proposal submissions. Proposers may find submission guidance at: http://www.darpa.mil/ixo/solicitations/dtttest/index.htm. Organizations must register at http://www.tfims.darpa.mil/baa in order to propose; submit one registration per proposal, so organizations wishing to submit multiple proposals should complete a separate registration for each. The deadline for registration is 22 November 2004 at the registration URL listed above. By registering, the Proposer makes no commitment to submit. Proposal submissions must be unclassified. Proposers must be willing to cooperate and exchange software, data, and other information with other contractors in a manner that contributes to the success of the program. This includes routine coordination with other contractors, especially the System Evaluator contractor and the System Design contractor. A statement of cooperation must be included in the final proposal. These matters are fully described in the PIP.

REQUIREMENTS/PROCEDURES: The Award Document for each proposal selected and funded will contain a mandatory requirement for submission of certain status reports. These reports shall be submitted electronically via the DARPA/IXO Technical-Financial Information Management System (T-FIMS), utilizing the Government-furnished Uniform Resource Locator (URL) on the World Wide Web (WWW) at http://www.tfims.darpa.mil/baa. Further details may be found in the PIP.

PROPOSAL DELIVERY: Proposals must be uploaded no later than 12:00 Noon EST, 29 November 2004 to be considered for funding. While BAA 05-04 will remain open until 14 October 2005, DARPA expects to make all anticipated awards based upon proposals submitted by the above upload deadline. Please reference the URL provided in the GENERAL INFORMATION section above for complete submission instructions.
PROTECTION OF INFORMATION: It is the policy of DARPA to treat all proposals as competitive information and to disclose their contents only for the purposes of evaluation and assessment. The Government may use selected support contractor personnel from SET Associates, Schafer Corporation, CACI International, and McNeil Technologies to assist in administrative functions only. Those contractors sign binding non-disclosure agreements with DARPA.

TECHNICAL AND ADMINISTRATIVE INQUIRIES: DARPA intends to use electronic mail for correspondence regarding BAA 05-04. Technical, contractual, or administrative questions must be received at [email protected] by 12:00 Noon EST, 15 November 2004. Answers to all questions generally relevant to the technical, contractual, and administrative aspects of the solicitation will be posted for public access at the URL provided in GENERAL INFORMATION above, under the title Frequently Asked Questions (FAQs).

OTHER IMPORTANT INFORMATION: The Government reserves the right to select for award all, some, or none of the proposals received in response to this announcement, and to award without discussions. All responsible sources may submit a proposal, which shall be considered by DARPA. Small Disadvantaged Businesses and Historically Black Colleges and Universities (HBCUs)/Minority Institutions (MIs) are encouraged to submit proposals and to join others in submitting proposals. However, no portion of this BAA will be set aside for HBCU and MI participation, due to the impracticality of reserving discrete or severable areas of technology for exclusive competition among these entities. Government contractors are required to register at the Government's Central Contractor Registration site in order to negotiate contracts with most government agencies. This URL is provided as a reference: http://www.ccr.gov.
Since this FedBizOpps Announcement, along with the PIP, constitutes a Broad Agency Announcement as contemplated in FAR 6.102(d)(2)(i), all prospective Proposers MUST also refer to the PIP before submitting a proposal. DARPA anticipates that contractor selections will be made during the first quarter of fiscal year 2005. Proposals MUST NOT be submitted by fax or e-mail; any so sent will be disregarded. The administrative addresses for this BAA are: Fax: (703) 741-0081, addressed to DARPA/IXO, BAA 05-04; Electronic mail: BAA [email protected]; Electronic file retrieval: http://www.darpa.mil/IXO/Solicitations/dtttest/index.htm. Original Point of Contact: Michael Blackstone, Contracting Officer, Fax (703) 741-0081, email [email protected].
Federal Grant Title: Dynamic Tactical Targeting: Tactical Exercises and System Testing (DTT:TEST)
Federal Agency Name: Defense Advanced Research Projects Agency
Grant Categories: Science and Technology
Type of Opportunity: Discretionary
Funding Opportunity Number: BAA05-04
Type of Funding: Grant
CFDA Numbers: Information not provided
CFDA Descriptions: Information not provided
Current Application Deadline: No deadline provided
Original Application Deadline: Nov 29, 2004 (registration to propose was due Nov 22, 2004)
Posted Date: Nov 03, 2004
Creation Date: Nov 03, 2004
Archive Date: Dec 31, 2004
Total Program Funding:
Maximum Federal Grant Award:
Minimum Federal Grant Award:
Expected Number of Awards:
Cost Sharing or Matching: No
Applicants Eligible for this Grant
Unrestricted (i.e., open to any type of entity above), subject to any clarification in text field entitled "Additional Information on Eligibility"
Link to Full Grant Announcement
Information not provided
Grant Announcement Contact
Jennifer Schoen, Contracting Officer, Phone (703) 696-2440, Fax (703) 741-0602, Email [email protected]

FederalGrants.com is not endorsed by, or affiliated with, any government agency. Copyright ©2007-2024 FederalGrants.com