Loren Data's SAM Daily™

FBO DAILY ISSUE OF NOVEMBER 27, 2009 FBO #2925
MODIFICATION

A -- Software and Systems Test Track

Notice Date
11/25/2009
 
Notice Type
Modification/Amendment
 
NAICS
541712 — Research and Development in the Physical, Engineering, and Life Sciences (except Biotechnology)
 
Contracting Office
Department of the Air Force, Air Force Materiel Command, AFRL - Rome Research Site, AFRL/Information Directorate 26 Electronic Parkway, Rome, NY, 13441-4514
 
ZIP Code
13441-4514
 
Solicitation Number
Reference-Number-BAA-06-13-IFKA
 
Point of Contact
Lynn G. White, Phone: (315) 330-4996
 
E-Mail Address
whitel@rl.af.mil
 
Small Business Set-Aside
N/A
 
Description
The purpose of this modification is to republish the original announcement pursuant to FAR 35.016(c). This republishing also includes the following changes: (a) Delete section I. Funding Opportunities Description - Research Concentration Area - Phase 2 and replace it with Research Concentration Area - Phase 3; (b) Add a SPRUCE reference to section I. Funding Opportunities Description - References; (c) Modify section I. Anticipated Schedule to reflect Phase 3; (d) Modify section II. Award Information to include Phase 3 awards; (e) Modify Section V. Application Review Information - 1. Criteria - (1) Overall Scientific and Technical Merit B and C, as well as (2) Related Experience. No other changes have been made.
NAICS CODE: 541712
FEDERAL AGENCY NAME: Department of the Air Force, Air Force Materiel Command, AFRL - Rome Research Site, AFRL/Information Directorate, 26 Electronic Parkway, Rome, NY, 13441-4514
TITLE: Software and Systems Test Track
ANNOUNCEMENT TYPE: Initial announcement
FUNDING OPPORTUNITY NUMBER: BAA #06-13-IFKA
CFDA Number: 12.800
DATES: It is recommended that white papers be received by the following dates to maximize the possibility of award: FY 07 by 31 May 2007; FY 08 by 31 December 2007; FY 09 by 31 December 2008; and FY 10 by 31 December 2009. White papers will be accepted until 2:00 p.m. Eastern time on 30 March 2010, but it is less likely that funding will be available in each respective fiscal year after the dates cited. FORMAL PROPOSALS ARE NOT BEING REQUESTED AT THIS TIME. See Section IV of this announcement for further details.
I. FUNDING OPPORTUNITY DESCRIPTION: Background: It is increasingly difficult to create software that deals successfully with increased device and system complexity. Moreover, networked distributed systems create vast increases in the scale and scope of information processing applications, exacerbating the challenges to the system engineers' ability to specify, design, build, verify and test software.
This situation is an emerging issue in information technology in general, but the requirements of military systems set them sharply apart from non-military applications in terms of reliability, robustness, security, interoperability and real-time operation. Business, government, and technical endeavors ranging from financial transactions to space missions increasingly require complex software systems to function correctly. The complexity of the software arises from stringent requirements (e.g., for reliability in performance and integrity of the data used), the need to support a range of interactions with the environment in real time, and/or certain structural features. These attributes make software difficult to produce. Major hardware-software failures in defense acquisition programs have occurred both because components that were expected to interoperate properly did not, and because tools did not work as advertised. Interoperability is required for network-centric environments but has proven very difficult to achieve. A scalable, flexible, realistic synthetic testing environment is required for stressing tools against key benchmarks, for assessment of the utility of tools by key program offices, and for use as a synthetic environment for testing tools against large systems (tens of millions of SLOC or larger) or systems-of-systems. Software-intensive systems are critical to meeting the demands of current and future warfighting capabilities. The costs of developing these systems -- in terms of dollars, time and human resources -- are increasing rapidly.
• Approximately 40% of DoD's RDT&E budget is currently spent on software development, and this is expected to increase.
• "The (Army FCS) software task alone is five times larger than that required for Joint Strike Fighter and ten times larger than the F-22, which after two decades is finally meeting its software requirements."
-- Congressman Curt Weldon (April, 2004)
The complexity of current and emerging systems and systems of systems (SoS) requires breakthrough approaches in system development, including new tools and methodologies spanning the entire system lifecycle from architecture design to system and SoS verification. One of the risk factors affecting the transitionability of such research is the difficulty of validating and verifying the functionality of research technologies against realistic problems. Poor collaboration among people working across the technology maturity lifecycle has created a "valley of disappointment" in which DoD programs fail to adopt advanced technologies, regardless of their inherent promise. A regime of ad hoc policies and procedures has arisen for transitioning software research into software practice in avionics and other domains. The expectation is that promising work funded by one organization early in the lifecycle, e.g., a DARPA program focusing on Science and Technology, will be perpetuated by another organization in the following phase, e.g., an AFRL program on Research & Engineering. The DoD has over 100 separate organizations engaged in technology transfer. The current strategy and the number of organizations involved make it hard for knowledgeable people across the Technology Readiness Lifecycle - customers, researchers, engineers, and operators - to collaborate and plan for transition. Program Engineers typically view new development tools and runtime platforms as risky due to:
• Insufficient evidence to prove their capabilities can pay off in production environments,
• Immature prototypes that lack stability, features, user support, training, and technology-to-tool chain integrations, and
• The absence of affordable and sustainable long-term commercial support plans.
Software-Intensive Systems Producibility researchers typically assume others in government or the commercial marketplace will address these problems.
It is the intent of the Systems and Software Test Track to help address all of the above issues.
Objective: The overall objective of this Systems and Software Test Track (SSTT) BAA program is to provide an open framework environment where an assortment of experimental tools and products may be deployed and allowed to interact in real-time, interactive, evolutionary and interdependent ways, thereby allowing rigorous testing of new technologies, methodologies and theories in support of the Software-Intensive Systems Producibility Initiative (SISPI). The Systems and Software Test Track will facilitate testing of Software-Intensive Systems Producibility research products and methods, provide an environment for research on DoD embedded systems and software problems, enable university and industry leverage of technology development, and establish a capability for successful technology transition and transfer. The SSTT is intended to be an open collaborative research and development environment to demonstrate, evaluate, and document the ability of novel tools, methods, techniques, and technologies to yield affordable and more predictable production of software-intensive systems. The SSTT system will bring together researchers, developers, and domain experts from different communities to de-fragment the knowledge necessary to achieve SISPI research, development and technology transition. The SSTT will be a system where SISPI researchers can test their research against relevant challenge problems, and where independent analysis of SISPI research can be performed. This independent analysis would enable the SSTT to support acquisition program offices and analyze the utility of tools. Also, SSTT users can bring their unsolved problems to provide challenges that drive SISPI research where no such tools are available, or search for a solution by leveraging existing capabilities available in the SSTT.
The SSTT will have several logical roles for collaborating participants: Challenge Problem Provider (for providing challenge problems), Candidate Solution Provider (for providing candidate solutions to challenge problems), Experimenter (for running experiments and collecting experimental results), Collaborator (involved in collaborating on challenge problems, candidate solutions, and/or experiments), Solution Inquirer (actively searching for a needed capability) and SSTT Administrator (responsible for administering the SSTT infrastructure). Various community participants may play one or more of these logical roles. The SSTT will enable Program Engineers from DoD Acquisition Programs to interact with SISPI Researchers from Industry Labs or Universities to define, discover and evaluate SISPI technology.
Phase 1 consisted of defining, developing, and documenting the Concept of Operations (CONOPS), a user-oriented document that describes system characteristics for a proposed system from the users' viewpoint; and defining, developing and documenting the architecture and fundamental organization of the Systems and Software Test Track as embodied in its components, their relationships to each other and the environment, and the principles governing its design and evolution. Initial concepts along with the final CONOPS and architectures were presented to representatives from Government, Industry and Academia. Two awards were made and all documentation is available on the BCSW server (please contact steven.drager@rl.af.mil to request access). The CONOPS is posted on FedBizOpps along with this modification. Phase 2 is the environment development and operations phase. The environment is open and available for use by developers as well as for independent analysis by the facility operators. This independent analysis allows the facility operators to support major defense acquisition program offices as well as analyze the utility of tools.
Program offices may bring their unsolved problems to the test track either for help in solving them or to look for the needed utility among the tools available. Lastly, this synthetic environment should provide a place where large code bases can be tested, allowing requirements verification prior to production and deployment. This risk reduction affords the ability to verify and validate the functionality of today's complex software-intensive systems while providing a realistic environment for researchers to verify their tools against realistic problems. The Systems and Software Test Track provides a place for experimental verification of Software-Intensive Systems Producibility technologies, given their novelty and the potential complexity of the underlying theories. The experimental platforms incorporate software technology to instrument, monitor and test large-scale applications. Challenge problems for the open experimental platforms are accessible to all the research teams. The experimental platform research includes subtasks to conduct large-scale coordination experiments, and to develop methods and tools for evaluating the aggregate performance of applications. This environment provides a full range of collaborative technology challenges, run-time platforms and applications, experiments, evaluations, and demonstrations. A common infrastructure will enable control and data flow between both kinds of application components in a distributed environment. The open experimentation environment will provide the fundamental reference architecture and underpinnings, helping researchers to develop and test their designs and facilitating the transition of promising technologies into production use.
Research Concentration Areas: The primary goal of Phase 3 is to research, design and experiment within the SSTT implementation, called SPRUCE.
This involves development of challenge problems and artifacts, and/or solutions to challenge problems, and/or experiments which validate solutions to challenge problems. Areas to consider for challenge problems, solutions and experiments include providing software artifacts, benchmarks, executables, source code, design documents, requirements documents, examples, models, fault data, lessons learned, software construction files and tools, but are not limited to the technologies listed below:
• Challenge problems, solutions and experiments currently listed in SPRUCE
• Provably Correct Code Generation and Automatic Software/System Analysis
• Software and Systems Composability
• Guaranteed System Interoperability
• Provably Trusted Components (COTS/GOTS/Hardware)
• Model-Based Development for Predictable Software Attributes
• Predicting the "-ilities" Early in the Life Cycle
• Emerging Technologies' Effect on Software
• Automatic Software Optimization (Performance, Power Consumption,...)
• Software Conversion to New Technology (Multi-Core, Cell,...)
• Fighting through Software Failures
• Understanding Software
• Cyber Critical Software (a variant of flight-critical software)
• Measurement Techniques and Software Forensics
References: Institute of Electrical and Electronics Engineers, IEEE Guide for Information Technology - System Definition - Concept of Operations (CONOPS) Document, IEEE Std 1362-1998, IEEE Computer Society Press, 1998. Systems and Software Producibility Collaboration and Experimentation Environment (SPRUCE), https://www.sprucecommunity.org
Anticipated Schedule: Phase 3, the Research and Operations phase, is expected to begin upon the beta release of Phase 2. This modification of the BAA initiates Phase 3.
II. AWARD INFORMATION: Total funding for this BAA is approximately $18M. The anticipated funding to be obligated under this BAA is broken out by fiscal year as follows: FY 06 - $1.0M; FY 07 - $1.0M; FY 08 - $6.1M; FY 09 - $4.9M; and FY 10 - $5.0M.
Individual awards will not normally exceed 6 months, with dollar amounts ranging between $300K and $400K per year, for Phase 1, and will not normally exceed 18 months, with dollar amounts ranging between $700K and $3.0M per year, for Phase 2. Phase 3 awards will not normally exceed 12 months, with dollar amounts ranging between $100K and $750K. Awards resulting from this announcement will be in the form of contracts, grants, cooperative agreements, or other transactions depending upon the nature of the work proposed.
III. ELIGIBILITY INFORMATION: 1. ELIGIBLE APPLICANTS: All potential applicants are eligible. Foreign allied participation is authorized at the prime contractor level. Foreign allied participation is allowed from the following countries: France, Germany, Greece, Israel, Italy, Luxembourg, Netherlands, Australia, Austria, Belgium, Canada, Denmark, Egypt, Finland, Norway, Portugal, Spain, Sweden, Switzerland, Turkey and the United Kingdom. 2. COST SHARING OR MATCHING: Cost sharing is not a requirement.
IV. APPLICATION AND SUBMISSION INFORMATION: 1. APPLICATION PACKAGE: THIS ANNOUNCEMENT CONSTITUTES THE ONLY SOLICITATION. WE ARE SOLICITING WHITE PAPERS ONLY. DO NOT SUBMIT A FORMAL PROPOSAL AT THIS TIME. Those white papers found to be consistent with the intent of this BAA may be invited to submit a technical and cost proposal. See Section VI of this announcement for further details. For additional information, a copy of the AFRL/Rome Research Site's "Broad Agency Announcement (BAA): A Guide for Industry," April 2007, may be accessed at: http://www.fbo.gov/spg/USAF/AFMC/AFRLRRS/Reference%2DNumber%2DBAAGUIDE/listing.html 2. CONTENT AND FORM OF SUBMISSION: Offerors are required to submit an electronic copy of a not-to-exceed 10-page white paper summarizing their proposed approach/solution. The purpose of the white paper is to preclude unwarranted effort on the part of an offeror whose proposed work is not of interest to the Government.
The white paper will be formatted as follows: Page 1 - Section A: Title, Period of Performance, Estimated Cost of Task, Name/Address of Company, Technical and Contracting Points of Contact (phone, fax and email); Pages 2 to not more than 10 - Section B: Task Objective, and Section C: Technical Summary and Proposed Deliverables. Multiple white papers within the purview of this announcement may be submitted by each offeror. If the offeror wishes to restrict its white papers/proposals, they must be marked with the restrictive language stated in FAR 15.609(a) and (b). All white papers/proposals shall be double-spaced with a font no smaller than 12 pitch. In addition, respondents are requested to provide their Commercial and Government Entity (CAGE) number, a fax number, and an e-mail address with their submission. All responses to this announcement must be addressed to the technical POC, as discussed in paragraph five of this section. 3. SUBMISSION DATES AND TIMES: It is recommended that white papers be received by the following dates to maximize the possibility of award: FY 07 by 31 May 2007; FY 08 by 31 December 2007; FY 09 by 31 December 2008; and FY 10 by 31 December 2009. White papers will be accepted until 2:00 p.m. Eastern time on 30 March 2010, but it is less likely that funding will be available in each respective fiscal year after the dates cited. Submission of white papers will be regulated in accordance with FAR 15.208. 4. FUNDING RESTRICTIONS: The cost of preparing white papers/proposals in response to this announcement is not considered an allowable direct charge to any resulting contract or any other contract, but may be an allowable expense under the normal bid and proposal indirect costs specified in FAR 31.205-18. Incurring pre-award costs, for ASSISTANCE INSTRUMENTS ONLY, is regulated by the DoD Grant and Agreement Regulations (DoDGARS). 5. OTHER SUBMISSION REQUIREMENTS: DO NOT send white papers to the Contracting Officer.
All responses to this announcement must be addressed to: Steven.Drager@rl.af.mil, Attn: Mr. Steven Drager. Respondents are required to provide their Dun & Bradstreet (D&B) Data Universal Numbering System (DUNS) number with their submittal and reference BAA 06-13-IFKA.
V. APPLICATION REVIEW INFORMATION: 1. CRITERIA: The following criteria, which are listed in descending order of importance, will be used to determine whether white papers and proposals submitted are consistent with the intent of this BAA and of interest to the Government: (1) Overall Scientific and Technical Merit, including the following: a) Clear definition and development of the technical concepts; b) Technical excellence, creativity and innovation of the development and experimentation approach, and the quality of the definition of proposed challenge problems or experiments; c) Evidence of research acumen in integration and development of the challenge problem and/or experiment. (2) Related Experience - The extent to which the offeror demonstrates relevant technical knowledge and competency and the necessary software technologies domain knowledge, as well as the diversity and quality of the team (balance of skill sets, excellence of academic-industry team members, and proven capability to conduct innovative practical research), and the strength of insight into program offices. (3) Maturity of Solution - The extent to which existing capabilities and standards are leveraged, the relative maturity of the proposed technology in terms of reliability and robustness, and previous demonstration of capability to accomplish the tasks. (4) Collaboration - The extent to which the offeror demonstrates commitment and detailed plans for collaboration among all interested and relevant parties in the SSTT.
This includes how researchers will collaborate within the SSTT, plans for sharing results from the SSTT with the broader community, and encouraging participation of team and non-team members alike in evaluations and information dissemination. (5) Reasonableness and realism of proposed costs and fees (if any). No further evaluation criteria will be used in selecting white papers/proposals. Individual white papers/proposals will be evaluated against the evaluation criteria without regard to other white papers and proposals submitted under this BAA. White papers and proposals submitted will be evaluated as they are received. 2. REVIEW AND SELECTION PROCESS: Only Government employees will evaluate the white papers/proposals for selection. The Air Force Research Laboratory's Information Directorate has contracted for various business and staff support services, some of which require contractors to obtain administrative access to proprietary information submitted by other contractors. Administrative access is defined as "handling or having physical control over information for the sole purpose of accomplishing the administrative functions specified in the administrative support contract, which do not require the review, reading, or comprehension of the content of the information on the part of non-technical professionals assigned to accomplish the specified administrative tasks." These contractors have signed general non-disclosure agreements and organizational conflict of interest statements. The required administrative access will be granted to non-technical professionals. Examples of the administrative tasks performed include: a. Assembling and organizing information for R&D case files; b. Accessing library files for use by government personnel; and c. Handling and administration of proposals, contracts, contract funding and queries.
Any objection to administrative access must be in writing to the Contracting Officer and shall include a detailed statement of the basis for the objection.
VI. AWARD ADMINISTRATION INFORMATION: 1. AWARD NOTICES: Those white papers found to be consistent with the intent of this BAA may be invited to submit a technical and cost proposal. Notification by email will be sent by the technical POC. Such an invitation does not assure that the submitting organization will be awarded a contract. Those white papers not selected for proposal submission will be notified in the same manner. Prospective offerors are advised that only Contracting Officers are legally authorized to commit the Government. All offerors submitting white papers will be contacted by the technical POC, referenced in Section VII of this announcement. Offerors can email the technical POC for the status of their white paper/proposal no earlier than 45 days after submission. 2. ADMINISTRATIVE AND NATIONAL POLICY REQUIREMENTS: Depending on the work to be performed, the offeror may require a TOP SECRET facility clearance and safeguarding capability; therefore, personnel identified for assignment to a classified effort must be cleared for access to TOP SECRET information at the time of award. In addition, the offeror may be required to have, or have access to, a certified and Government-approved facility to support work under this BAA. Data subject to export control constraints may be involved, and only firms holding certification under the US/Canada Joint Certification Program (JCP) (www.dlis.dla.mil/jcp) are allowed access to such data. The technical data rights proposed must be consistent with the requirement that the Software and Systems Test Track is a resource available to the community at large. The interface specifications will be made available to all parties having their requests for access granted by the Government. 3.
REPORTING: Once a proposal has been selected for award, offerors will be required to submit their reporting requirements through one of our web-based reporting systems, known as JIFFY or TFIMS. Prior to award, the offeror will be notified which reporting system to use, and will be given complete instructions regarding its use.
VII. AGENCY CONTACTS: Questions of a technical nature shall be directed to the cognizant technical point of contact, as specified below: TPOC Name: Steven Drager, Telephone: (315) 330-2735, Email: Steven.Drager@rl.af.mil. Questions of a contractual/business nature shall be directed to the cognizant contracting officer, as specified below: Lynn White, Telephone: (315) 330-4996, Email: Lynn.White@rl.af.mil. The email must reference the solicitation (BAA) number and title of the acquisition. In accordance with AFFARS 5315.90, an Ombudsman has been appointed to hear and facilitate the resolution of concerns from offerors, potential offerors, and others for this acquisition announcement. Before consulting with an ombudsman, interested parties must first address their concerns, issues, disagreements, and/or recommendations to the contracting officer for resolution. AFFARS Clause 5352.201-9101, Ombudsman (Aug 2005), will be incorporated into all contracts awarded under this BAA. The AFRL Ombudsman is: Susan Hunter, Building 15, Room 225, 1864 Fourth Street, Wright-Patterson AFB OH 45433-7130; FAX: (937) 225-5036; Comm: (937) 255-7754. All responsible organizations may submit a white paper, which shall be considered.
 
Web Link
FBO.gov Permalink
(https://www.fbo.gov/spg/USAF/AFMC/AFRLRRS/Reference-Number-BAA-06-13-IFKA/listing.html)
 
Record
SN02012193-W 20091127/091125235652-355fbd80a0d621c61d5fe6f93cb13cca (fbodaily.com)
 
Source
FedBizOpps Link to This Notice
(may not be valid after Archive Date)
