Loren Data's SAM Daily™

FBO DAILY - FEDBIZOPPS ISSUE OF MARCH 12, 2016 FBO #5223
SOURCES SOUGHT

70 -- Information for Army Exam and Survey Application (AESA)

Notice Date
3/10/2016
 
Notice Type
Sources Sought
 
NAICS
541512 — Computer Systems Design Services
 
Contracting Office
Department of the Army, Army Contracting Command, MICC, MICC - Fort Eustis (Joint Base Langley-Eustis), Building 705, Washington Blvd, Fort Eustis, Virginia, 23604-5538, United States
 
ZIP Code
23604-5538
 
Solicitation Number
W911S0-16-S-AESA
 
Point of Contact
Ronda D. Wilson, Phone: 757-501-8192, Cynthia L. Watson, Phone: 757-501-8121
 
E-Mail Address
ronda.d.wilson2.civ@mail.mil, cynthia.l.watson24.civ@mail.mil
 
Small Business Set-Aside
N/A
 
Description
Request for Information: W911S0-16-S-AESA

PLEASE FORWARD CAPABILITY STATEMENTS TO ronda.d.wilson2.civ@mail.mil or cynthia.l.watson24.civ@mail.mil.

Information for Army Exam and Survey Application (AESA)

THERE IS NO SOLICITATION AVAILABLE AT THIS TIME. REQUESTS FOR A SOLICITATION WILL NOT RECEIVE A RESPONSE.

This Sources Sought Synopsis supports market research being conducted by the Mission and Installation Contracting Command, Fort Eustis to identify existing commercial software applications that can provide an Army Exam and Survey Application (AESA) to automate the management and administration of online exams and surveys. The responses to this market research will also determine whether the AESA can be acquired from a small business or whether this effort should be an unrestricted procurement. This is not a request for proposal (RFP).

Problem Statement: The Army uses web-based applications for Distributed Learning (DL). These applications automate course scheduling, registration, training administration, testing, record keeping, reporting, collaboration, course delivery, and other training support functions. Some of these applications also deliver online exams and surveys to logged-in students. There are thousands of exams and associated sets of question banks, answer keys, and test items, including multiple versions of exams and end-of-course surveys, all of which must be controlled and managed. In addition, the Army needs to control the process for exam and survey development, testing, assembly, and publishing. These exams and surveys must undergo a technically complex development and testing process for assembly into tiered data structures across two (or more) automated hosting platforms. Currently, more than seventy (70) different Army/DoD DL-producing organizations develop online exams and surveys, and each organization needs its own secure organizational domain for development, testing, and storage of content. The Army requires a commercial software application to create, store, manage, administer, and deliver exams and surveys to learners, and to produce analytics on them.

AESA Desired Capabilities: The Army's intent is to acquire a COTS application, with licenses, to store, manage, and administer all online exams and surveys using one centralized web-based system that is integrated with existing training management systems. Students will use this system to take online exams and surveys as components of training/education curricula or as stand-alone training products supporting classroom instruction. Trainers will use the application to import individual question/answer combinations from other training systems or to create and store exams directly through a user interface (UI). Instructors/proctors will use the application to remotely administer exams and surveys to students in either synchronous or asynchronous modes. Training managers will use the application to obtain pre-formatted or custom reports from a database composed of individual student exam/survey question responses. The Army will use the AESA to close a large capability gap in standardized online exam training management and administration at the enterprise level. The AESA should use an online enterprise approach to manage the lifecycle of exam content and related historical data, including content development, content testing, exam updating, and student access to and completion of exams. The AESA should provide near-real-time interfaces or data exchange with other Army training systems.
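For illustration only: this notice does not prescribe any exchange format, interface, or field names. The minimal Python sketch below merely pictures the kind of registration and completion data the AESA might exchange in near-real time with an external training system, as described above and elaborated in the interface requirements that follow; every name in it is a hypothetical assumption, not a requirement of this notice.

# Hypothetical sketch only: no data format or interface is specified in this
# notice. The field names below illustrate the kind of registration and
# completion data the AESA would exchange with external training systems.
import json
from dataclasses import dataclass, asdict
from typing import Optional


@dataclass
class ExamRegistration:
    """Registration record pushed to the AESA by an external training system."""
    student_id: str        # user identity asserted by the external system
    exam_id: str           # learning-object identifier
    exam_version: str
    organization: str      # owning organizational domain
    scheduled_start: str   # ISO 8601 timestamp


@dataclass
class ExamCompletion:
    """Status/completion record returned to the originating system."""
    student_id: str
    exam_id: str
    status: str                          # e.g. "registered", "in_progress", "completed"
    raw_score: Optional[float] = None
    passed: Optional[bool] = None
    completed_at: Optional[str] = None


# One round trip serialized as JSON, as a near-real-time exchange might look.
registration = ExamRegistration("S1234567", "EX-0042", "3.1",
                                "Example Schoolhouse", "2016-04-01T13:00:00Z")
completion = ExamCompletion("S1234567", "EX-0042", "completed",
                            raw_score=87.5, passed=True,
                            completed_at="2016-04-01T14:05:22Z")

print(json.dumps(asdict(registration), indent=2))
print(json.dumps(asdict(completion), indent=2))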
The AESA must accept and process exam/survey registration data from external systems and provide status and completion data back to those systems. The AESA must exchange user identity, user authentication, learning-object identity and related data, registration and completion data, etc., with other systems. A single software application is desired; however, an alternate solution may be composed of a combination of applications that, working together, provide all or most of the features shown below. The COTS software should require no or minimal customization to meet the initial AESA functional requirements; however, it should be customizable to support new functionality in the future.

The AESA should:

a. Be web-accessible to all users with no client-side plug-ins or applets required.
b. Operate in the .com and .mil networks (must be US Army Network Enterprise Technology Command (NETCOM) NetOps compliant).
c. Operate stand-alone or in combination with other automated business systems, using data provided by them or providing data to them.
d. Process validated direct user access requests ("stand-alone" mode) from external systems performing identification and authorization (I&A) when an access request is provided directly to the AESA.
e. Process valid user exam or survey requests redirected to the AESA from external training systems.
f. Store user account data, including selected user profile elements, and update these elements upon login.
g. Control system access based on matching user access requests against known user accounts.
h. Automatically create and store new accounts for authorized users.
i. Comply with data protection laws, Army policies, and industry information assurance best practices. Online exam materials are considered to be sensitive but unclassified. The AESA will neither store nor otherwise handle classified information, but it will store and distribute data that carries the For Official Use Only (FOUO) designation.
j. Provide survey user identity anonymity at the reporting level.
k. Meet requirements for (or already have) an Army Certificate of Networthiness (CON) and Army Approval to Operate (ATO) in the Non-secure Internet Protocol Router Network (NIPRNET).
l. Track workflow status across the lifecycle of the exam or survey, including content development and testing, fielding and delivery, content revision, recoding or other sustainment actions, retirement, and archiving.
m. Provide import/upload functionality for common media file types, including jpg, wav, mpeg, etc.
n. Resume functions after interruption without loss of data.
o. Be capable of delivering exams and surveys to, and obtaining responses/results from, Android, iOS, or Windows mobile devices.
p. Employ a range of question parameters, including weighting, sub-setting, and partial crediting.
q. Support organization of exam questions into sub-sets up to two levels.
r. Establish and maintain searchable associations between question/answer combinations and learning objectives.
s. Insert objects, composed of common types of media files, into exams or individual question/answer combinations.
t. Provide built-in user interfaces for exam development, administration, and delivery (the exam/survey-taking environment).
u. Assign and maintain user roles as part of profile data for use in controlling system permissions, access privileges, etc.
v. Restrict access to exams, exam components, and related materials to users authorized by system-enforced security/job roles and organizational domains.
w. Provide configurable controls to limit student attempts on exams.
x. Establish configurable controls to specify exam performance feedback detail at the learning-object level.
y. Provide aggregated reports at the individual exam response level that support "test-item analysis."
z. Support UI searches based on user name, unique identifier, questions, learning objective, exam or survey title/version number, organization/domain, or date range.
aa. Provide a configurable audit capability (user, date/time, action, reason) for specified objects and fields (create/edit/delete activities).
bb. Associate (assign) users with specific exam(s) from the UI or in bulk via import.

The AESA should support:

a. Various/multiple question formats, such as hot spot, drag and drop, matching, multiple choice (one or more correct), true/false, and short answer/essay.
b. Upload and storage of student files, such as written assignments in response to exam requirements, within specified time limits.
c. Electronic signatures from learners in support of student assessment verification.
d. Optional bookmarking during an exam session with re-authentication upon resumption.
e. Reuse of stored exam/survey question/response combinations.
f. User-defined multi-level organization of questions and assessments into exam sections, with sectional scoring against configurable mastery score definitions and reporting of overall results and results by section.
g. Custom "branding" of the UI for exam or survey presentation windows at the organizational level.
h. Knowledge-based test question types described in TRADOC Pam 350-70-5 (http://www.tradoc.army.mil/tpubs/pams/p350-70-5.pdf).
i. Import (bulk) and manual input of individual exam results from external assessments.
j. Pre-formatted and customizable report templates.
k. Designation of a proctor role with associated permissions, such as testing with live face-to-face proctors or testing with a remote proctor using a webcam and other authenticating devices.
l. Establishment of durations for individual exams, setting/controlling actual exam start/stop times, and allowing reset of testing time limits by instructors/proctors.
m. Multiple instances of the same scheduled exam and user selection of specific exam events by location and date/time.

Responses: Responses should show how the product(s) would satisfy the functionality shown above and identify where configuration and/or customization may be needed, with an indication of the degree of effort required for each customization (minor, moderate, or major). Responses should explain how legacy SCORM-compliant exam and survey content could be migrated into the product(s) and used. Responses should identify whether the identified product(s) can accept, store, execute, interpret, or score SCORM content, and any limitations on SCORM version. There is a 20-page limit on technical information responses. Responses must be electronic documents in a format readable and usable by Microsoft (MS) Word 2007 (12-point font) or in PDF format viewable with the standard Adobe Acrobat Reader. In addition to the written response, upon the request of and at no cost to the Government, responders should be able to provide a demonstration of the basic product(s) that underlie their response. Virtual demonstrations that allow for concurrent live discussion with the responder will be accepted.
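The notice asks responders to explain how legacy SCORM-compliant content could be migrated but does not prescribe any tooling. As an illustration only, the Python sketch below shows one generic way a responder might inventory a legacy package by reading the imsmanifest.xml found at the root of every SCORM-conformant zip package; the package file name is a hypothetical assumption, and the sketch claims nothing about any particular product.

# Hypothetical sketch only: one way to inventory legacy SCORM exam/survey
# packages before migration.  It reads the imsmanifest.xml at the root of a
# SCORM package and lists the items the manifest declares.
import zipfile
import xml.etree.ElementTree as ET


def local_name(tag):
    """Strip the XML namespace so SCORM 1.2 and SCORM 2004 manifests parse alike."""
    return tag.rsplit("}", 1)[-1]


def inventory_scorm_package(path):
    """Return the items declared in a SCORM package's imsmanifest.xml."""
    with zipfile.ZipFile(path) as pkg:
        root = ET.fromstring(pkg.read("imsmanifest.xml"))

    items = []
    for element in root.iter():
        if local_name(element.tag) == "item":
            title = ""
            for child in element:
                if local_name(child.tag) == "title":
                    title = (child.text or "").strip()
            items.append({
                "identifier": element.get("identifier", ""),
                "identifierref": element.get("identifierref", ""),
                "title": title,
            })
    return items


if __name__ == "__main__":
    # Hypothetical package name; any SCORM-conformant zip package would do.
    for item in inventory_scorm_package("legacy_exam_course.zip"):
        print(item)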
Proprietary Data: All information submittals containing proprietary data must be appropriately marked. It is the respondent's responsibility to clearly define to the Government what is considered proprietary data. Responders are advised that any data submitted to the Government in response to this RFI will be released to non-Government advisors for review and analysis. Any non-Government advisor will serve strictly in an advisory capacity to the Government. No basis for claim against the Government shall arise as a result of a response to this Sources Sought or Government use of any information provided. Elaborate proposals or pricing information are neither required nor desired. Innovative and new conceptual ideas to achieve the stated objective are encouraged. No solicitation document exists for this Sources Sought notice. This Request for Information is published in accordance with FAR 15.201(e) and is for PLANNING PURPOSES ONLY. The Government will not pay for any effort expended in responding to this Sources Sought announcement. The Government does not intend to award a contract based on this RFI or to reimburse the costs incurred in providing the information requested under this notice. This RFI is open to any capable and qualified commercial source, and partnering is encouraged.

Responses must be submitted and received no later than 5:00 PM EST on 24 March 2016 via email to ronda.d.wilson2.civ@mail.mil or cynthia.l.watson24.civ@mail.mil.

Place of Performance: Distributed Learning Systems (DLS), 11846-B Rock Landing Drive, Newport News, VA 23606.

Compliance requirements are updated frequently. Army network security information can be found at http://www.army.mil/netcom and through additional links accessible from there.

Contracting Office Address: MICC Center - Fort Eustis (Joint Base Langley-Eustis), 705 Washington Blvd., Ste 126, Fort Eustis, VA 23604-5538.

Points of Contact: Cynthia Watson, 757-501-8121, or Ronda Wilson, 757-501-8192.
 
Web Link
FBO.gov Permalink
(https://www.fbo.gov/notices/810d9b74620fd7758099fec8120d1664)
 
Place of Performance
Address: Distributed Learning Systems (DLS), 11846-B Rock Landing Drive, Newport News, Virginia, 23606, United States
Zip Code: 23606
 
Record
SN04047206-W 20160312/160311000638-810d9b74620fd7758099fec8120d1664 (fbodaily.com)
 
Source
FedBizOpps Link to This Notice
(may not be valid after Archive Date)
