Operations Research Systems Analyst

Fort Huachuca, AZ
Defense – Testing /
Regular /
On-site
Requisition #: 382
Job Title: Operations Research Systems Analyst
Location: 2001 Brainard Rd, Ft. Huachuca, Arizona 85613
Clearance Level: Active DoD - Secret

Required Certification(s): 
·  Related certifications in the area of specialization may be substituted for 1 year of experience.  Examples of relevant Government or industry training include, but are not limited to, DAWIA Test and Evaluation Level 1/2/3 certification, ENG Level 1/2/3 certifications, CTEP certification, and applicable United States Federal or DoD school training and certification on a specific system, network, or technology. (See TEC II Services PWS section 6.4.4 for listing of technical skills, training, credentials, and experience.)

SUMMARY
The Joint Interoperability Test Command (JITC) Test, Evaluation and Certification II Services (TEC II) contract provides Information Technology/National Security Systems (IT/NSS) T&E support and services to test and evaluate critical strategic and tactical IT/NSS systems and capabilities that support DoD day-to-day operations. JITC is one of DoD's Operational Test Agencies (OTAs), responsible for testing, evaluating, validating, and certifying the IT/NSS hardware- and software-intensive systems, capabilities, and products that directly support DoD's business and warfighting operations.

The Operations Research Systems Analyst (ORSA) candidate should have a good understanding of DoD Information Technology (IT) and of how to conduct Operational Test and Evaluation (OT&E) of DoD IT/NSS systems and capabilities. The selected ORSA must be able to use mathematical modeling, simulation, and statistical design techniques to assist the Government customer in developing a Design of Experiments (DoE) and Science Based Test Design (SBTD) approach and methodology for the software-intensive capabilities and systems supporting DoD strategic business and tactical warfighting missions. During the planning phase, the ORSA will help develop an approach and methodology for the System Under Test (SUT) that provides the tester with the test result information key stakeholders need to make sound fielding decisions. The candidate will be responsible for developing a data collection approach used during OT&E, ensuring that SUT data is planned for and collected during events and activities and that the data can be shared across the different test requirement needs. That data collection approach will support test planning, test execution, and the analysis needed to make Measures of Suitability (MOS), Measures of Effectiveness (MOE), Measures of Performance (MOP), and Net Ready Key Performance Parameter (NR KPP) determinations for the IT/NSS SUT. The ORSA requires above-average interpersonal, written, and verbal skills, and must be able to communicate, collaborate, and coordinate in a dynamic and changing environment.


JOB DUTIES AND RESPONSIBILITIES
·         The ORSA will formulate and apply mathematical modeling, simulation, statistical design, and other optimization methods, using various software packages, with direct application to the T&E of IT/NSS. Acts as an expert in the application of Design of Experiments (DoE) and Science Based Test Design (SBTD) to T&E disciplines in order to provide valuable tactical, operational, and strategic information to key program stakeholders, senior leaders, and decision-makers for the SUT.
·         Routinely meets with customer command group technical teams and other senior management to help ensure that DoD customer testing programs/projects use testing methodologies that integrate SBTD procedures and processes, including DoE, various operations research techniques, statistical rigor designed for IT testing, and regression techniques.
·         Incorporates DoE, SBTD, and other statistical and analytical methods into C5ISR, AIS, and IT/NSS Test and Evaluation Master Plans (TEMPs) and other test-planning documents. Provides analysis and interprets SBTD results with respect to overall system operational effectiveness and operational suitability.
·         Identifies, develops, and analyzes technical and operational metrics to quantify the effectiveness of operations in support of the objectives and effects outlined in the T&E plan, including task and effects assessments.
·         Assists test teams during test planning, test execution, data analysis, and formal report development. Provides expertise for defining problems, developing the analysis plan, gathering and reviewing data, constructing the model(s), testing and evaluating, analyzing the results, developing insights, and documenting the results. Recommends and implements processes that support quantitative and qualitative analysis, as appropriate.
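By way of illustration only (not a requirement of the position), the full-factorial Design of Experiments referenced in these duties can be sketched in a few lines of Python; the factor names below are purely hypothetical:

```python
from itertools import product

# Hypothetical test factors for a software-intensive SUT (names are illustrative)
factors = {
    "network_load": ["low", "high"],
    "encryption": ["off", "on"],
    "client_count": [10, 100],
}

def full_factorial(factors):
    """Enumerate every combination of factor levels (a 2^k design here)."""
    names = list(factors)
    return [dict(zip(names, combo)) for combo in product(*factors.values())]

runs = full_factorial(factors)
print(len(runs))  # 2^3 = 8 test runs
```

In practice an ORSA would often use fractional-factorial or optimal designs to reduce run counts while preserving statistical power.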
·         Formulates mathematical and/or simulation models of test problems in support of T&E of C5ISR, AIS, and IT/NSS, relating test constants and variables, restrictions, alternatives, constraints, assumptions, conflicting objectives, and numerical parameters.
·         Integrates system T&E information from various sources for efficient multi-stage design, statistical modeling, and, as applicable, analyses, including incorporation of applicable statistical confidence intervals and sample sizing into test methodologies.
·         Analyzes discrepancies in T&E service or performance and makes recommendations for test conduct updates. Produces clear and concise data analysis and conducts trend analyses of various data. Compiles reports, charts, and tables based on established statistical methods.
·         In support of integrated DoD IT/NSS OT&E efforts, assists test teams with SUT test planning, data collection, analysis, and reporting for all test events. Uses appropriate system documentation to perform requirements analysis and ensure sufficient information is known about any new capability or SUT.
·         Assists test teams with the planning of integrated Operational Test (OT) events, including: updating and developing a Data Source Matrix (DSM) for each T&E event, developing a formal SUT Risk Assessment (RA), preparing and presenting a SUT Test Concept Brief (TCB), creating and delivering a SUT Test Plan, creating T&E event Test Readiness Review (TRR) slides, and compiling the SUT Operational Evaluation Framework (OEF).
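For illustration only, the "statistical confidence intervals and sample sizing" work mentioned above often reduces to calculations like the following Python sketch (the function name and test scenario are hypothetical):

```python
from math import ceil
from statistics import NormalDist

def sample_size_for_proportion(confidence: float, margin: float, p: float = 0.5) -> int:
    """Minimum number of test trials needed to estimate a success
    proportion (worst case p = 0.5) to within +/- margin at the
    given two-sided confidence level."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # two-sided z-score
    return ceil(z**2 * p * (1 - p) / margin**2)

# e.g. estimating a message-completion rate to within +/-5% at 95% confidence
print(sample_size_for_proportion(0.95, 0.05))  # 385 trials
```

Tightening the margin or raising the confidence level grows the required run count quadratically, which is exactly the trade-off a test-design methodology has to balance.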

SUPERVISORY DUTIES
·         Provides input to staff involved in writing and updating technical documentation. Provides guidance and work leadership to less-experienced T&E software, hardware, and integrated-system engineers, and may serve as a technical team or task leader. Provides consultation on complex projects as a top-level contributor/specialist; must be expert at problem-solving, identifying risk, and communicating results and recommendations.

QUALIFICATIONS


Education, Background, and Years of Experience
·         Requires a Doctorate and at least 4 years of experience; a Master’s degree and at least 10 years of experience; or a Bachelor’s degree and 16 years of experience. The degree must be in a relevant technical curriculum and the experience must be related to the job duties. Relevant technical degrees include various engineering and science disciplines, Physics, Operations Research, Computer Science, and Mathematics, though the Government may consider other curricula.

ADDITIONAL SKILLS & QUALIFICATIONS

Required Skills
·         Ability to assist a team in the development/production of OT&E products and documents (e.g., Test Concept Briefs (TCB), Data Source Matrices (DSM), Test Plans, Quick Look Reports (QLR), and Final Report Memoranda) that clearly articulate OT&E results and findings.

Preferred Skills
·         Have experience in the following IT/NSS T&E domains:
          - Development Test and Evaluation (DT&E)
          - Operational Test and Evaluation (OT&E)
          - Interoperability Test and Evaluation (IOP T&E)
          - Cybersecurity Test and Evaluation (CS T&E)
          - Integrated Test and Evaluation (Integrated T&E)

·         Excellent organizational, coordination, interpersonal and team building skills.
·         Relies on extensive experience and judgment to plan and accomplish team goals.
          
·         Should have some experience or familiarity with the following software IT/NSS T&E methodologies, processes, or tools:
          - DevSecOps
          - Data Modeling
          - Power BI or applicable analytical tools

WORKING CONDITIONS

Environmental Conditions
·         General office environment. Work is generally sedentary in nature, but may require standing and walking for up to 10% of the time. The working environment is generally favorable. Lighting and temperature are adequate, and there are no hazardous or unpleasant conditions caused by noise, dust, etc. Work is generally performed within an office environment, with standard office equipment available.
Strength Demands
·         Sedentary – 10 lbs. maximum lifting; occasional lift/carry of small articles. Some occasional walking or standing may be required. Jobs are sedentary if walking and standing are required only occasionally and all other sedentary criteria are met.
Physical Requirements
·         Stand or Sit; Walk; See