Assessment Architecture
Security Assessment Plan (SAP)
System Name: Student Name
Section 1.1 - Provide a complete list of hardware consistent with the architecture diagram. List each asset/host individually by hostname or unique identifier.
Baseline Hardware List
Device Name (Unique Identifier) | Manufacturer | Model Number | Firmware / OS | Purpose | Optional Field (might include fields such as Building and Room, IP, Approval Status (using DISA approved HW list, Common Criteria certification, etc.))
Router1 | Cisco | ISR 4221 | 15.5 | Perimeter Router
*ADD ROWS AS NEEDED*
Section 1.2 - Provide testable software components, such as IA-enabled applications and operating systems.
Baseline Software List
Manufacturer | Name | Version | Function | Optional Field (might include fields such as License Expiration, Approval Status (using DISA approved SW list, Common Criteria certification, etc.))
Adobe | Acrobat Pro | Acrobat 19.010.20069 | Document Creation
*ADD ROWS AS NEEDED*
Section 1.3 - Provide a copy of the architecture diagram and complete the assessment location fields. [Text] is provided as sample content only; replace with system-specific content.
Architecture / Assessment Boundary Diagrams (not required for Assignment) | Assessment Location
Embed (or provide separately) a copy of the architecture diagram used to develop this SAP. Do not reference an architecture diagram uploaded in eMASS, as eMASS artifacts can be changed over time and, if changed, may invalidate this SAP. Changes to the architecture diagram require an update to the SAP and may require additional SCA review and approval. Consult the assigned SCA Liaison as needed.
Embed Diagram Here:
Location(s) | Environment Type (Dropdown)
Instructions:
1. Click on this cell
2. Choose 'Insert Object'
3. Use the 'Create From File' tab and locate the file
4. Check the box 'Display as Icon'
5. Click 'OK'
Assessment Methods
RMF SAP Continued
Section 2.0 - Complete all fields in this tab, ensuring consistency with the 'Assessment Architecture' tab.
Section 2.1 - In the "Test Battery" column, list each assessment method that will be executed as part of the Security Assessment Plan. In the "Test Target" column, list every host that will be assessed by that method, by hostname whenever applicable. In the corresponding fields, include the verification method that will be used by the validator and how the results/output will be captured; a sketch of how checklist output might be summarized appears after the table.
Requirements Traceability
Test Battery | Test Target (Component, Software, Technology, or Policy) | Verification Method: (E) Examine, (I) Interview, (T) Test | Output
NIST SP 800-53A Rev4 Security Controls Assessment Procedures for L – L – L | System | E, I, T | Procedures and results will be captured in a spreadsheet for each applicable security control assessment procedure
Assured Compliance Assessment Solution (ACAS) Vulnerability scan(s) | All assets | T | Results will be provided in a .nessus file
Traditional Security Technical Implementation Guide (STIG) | System | E, I, T | STIG Viewer .ckl results will be provided
Enclave Testing Security Technical Implementation Guide (STIG) | System | E, I | STIG Viewer .ckl results will be provided
Network Perimeter Router L3 Switch STIG - Ver 8, Rel 32 | Router1 | E, I, T | STIG Viewer .ckl results will be provided
Firewall SRG - Ver 1, Rel 3 | Firewall1 | E, I, T | STIG Viewer .ckl results will be provided
Network Layer 2 Switch STIG - Ver 8, Rel 27 | Switch01, Switch02 | E, I, T | STIG Viewer .ckl results will be provided
*ADD ROWS AS NEEDED*
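Several of the test batteries above deliver their output as STIG Viewer .ckl checklists. The following is a minimal sketch of how a validator might tally finding statuses from those files before transcribing results into the Risk Assessment Report; it assumes the checklist export is XML in which each VULN element carries a STATUS child (values such as Open, NotAFinding, Not_Applicable, Not_Reviewed), so element names should be confirmed against the actual STIG Viewer output.

```python
# Minimal sketch: tally finding statuses from STIG Viewer checklist (.ckl) files.
# Assumption: the .ckl export is XML in which each <VULN> element carries a
# <STATUS> child; verify element names against the actual STIG Viewer export.
import sys
import xml.etree.ElementTree as ET
from collections import Counter


def tally_ckl(path):
    """Count <VULN> entries in one checklist, grouped by their <STATUS> text."""
    counts = Counter()
    for vuln in ET.parse(path).getroot().iter("VULN"):
        counts[vuln.findtext("STATUS", default="Unknown")] += 1
    return counts


if __name__ == "__main__":
    # Usage: python tally_ckl.py Router1.ckl Switch01.ckl ...
    for ckl in sys.argv[1:]:
        print(ckl, dict(tally_ckl(ckl)))
```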
Assessment Personnel & Schedule
RMF SAP Continued
Section 3.0 - Complete all fields in this tab
Section 3.1 - Provide a list of assigned personnel.
Assessment Personnel
Title | Name | Telephone | Email Address
Program Manager
Validator
Site/Program ISSM
ISSE
System Administrator
*ADD ROWS AS NEEDED*
Section 3.2 - Provide a schedule of assessment activities. [Text] is provided as sample content only; replace with system-specific content. Events can be modified as needed for each system and are provided only as suggestions.
Assessment Schedule
Date(s) | Duration | Event
Assessment Objectives Approval (Stakeholder concurrence)
SETUP & CONNECTIVITY
Configuration Verification
Assessment Procedures Walkthrough and finalization of assessment execution details
Setup and connectivity checks
ASSESSMENT EXECUTION
Assessment objective(s) under test (e.g., manual STIG checks of Windows 7 workstations for assessment procedure compliance)
ACAS scanning
Re-run tests as needed
Combine Manual and Automated results (if applicable)
Document raw results
ASSESSMENT REPORT
Synchronize results with the Risk Assessment Report
Identify False Positives and Misleading Results
Perform Gap Analysis
Execute any additional testing identified in Gap Analysis
Add vulnerabilities to Control Status
Populate SAR / Perform Risk Analysis
Update POA&M
Archive raw test data for submission
Exceptions
RMF SAP Continued
Section 4.0 - Review each question below and respond accordingly. Provide additional information in column E as required.
Yes or No (dropdown) | Exceptions in Testing | Additional Information Required
Is there a relationship between this SAP and other plans and documents? If yes, describe the related plans (e.g., Master Test Plan, Continuous Monitoring Strategy, etc.) here.
Are there any Testing Limitations due to equipment, time, lab availability, system access, system admin availability, etc.? If yes, list them here. If limitations are discovered during the test event, document them here at a later date.
Are there any Related Tests being performed as part of this assessment event? If yes, describe related tests (e.g., penetration testing, Web Risk Assessments). Documentation format will be as follows: a. Test Title; b. Date conducted; c. Related system being tested; d. Responsible Organization; e. Impact on testing for this system or product
Are there any Additional Test Considerations that need to be considered that have not previously been identified in this plan? If yes, describe here:
Are Custom Test Cases required to complete this assessment? If yes, describe the driver behind the test cases (e.g., the absence of an applicable STIG or SRG for a system under test) and the method by which they were generated (e.g., developed using vendor hardening guidance, best practices, or other references as applicable). The preferred format is the DISA STIG format, with custom test cases grouped by technology. If a custom test case is required, it must be traced to an applicable security control and assigned a Severity Category based on criteria that shall also be documented here. All custom test cases will be grouped into distinct test batteries and referenced in the Requirements Traceability table.
Security Test Report
RMF SAP Continued
Section 5.0 - This report provides additional information useful for documenting test events and any conditions or exceptions realized during the event that may require further review.
Section 5.1 - Complete all fields to adequately identify conditions where a test case indicated non-compliance but was determined to be incorrectly marked as an open finding (i.e., a false positive). [Text] is provided as sample content only; replace with system-specific content.
False Positives
Source of Discovery or Test Tool Name | Test ID | Description of Vulnerability/Weakness | Comment | Trouble Ticket #
[ACAS] | [12345] | [System must use NTFS] | ACAS incorrectly identified multiple assets as not utilizing NTFS. Manual testing was performed and verified that NTFS is correctly implemented and used (see the verification sketch after this table). | DISA ACAS Helpdesk TT#123456
ADD ROWS AS NEEDED
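When a scan result is suspected to be a false positive, the finding is re-tested manually and the evidence is recorded alongside the trouble ticket. The sketch below shows one hedged way the NTFS check in the sample row might be re-verified on a single host; the use of the third-party psutil package and the output format are assumptions for illustration, not part of the template, and the plugin ID and ticket number above are sample content only.

```python
# Minimal sketch: confirm the filesystem type reported for each mounted volume,
# as described in the sample false-positive row above.
# Assumption: the third-party psutil package is installed (pip install psutil).
import psutil


def non_ntfs_volumes():
    """Return mounted partitions whose reported filesystem type is not NTFS."""
    return [p for p in psutil.disk_partitions() if p.fstype.upper() != "NTFS"]


if __name__ == "__main__":
    offenders = non_ntfs_volumes()
    if not offenders:
        print("All mounted volumes report NTFS.")
    for part in offenders:
        print(f"{part.device} mounted at {part.mountpoint}: {part.fstype or 'unknown'}")
```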
Section 5.2 - Complete all fields to adequately identify any misleading results. For a detailed explanation, see the References tab. [Text] is provided as sample content only; replace with system-specific content.
Misleading Reports
Source of Discovery or Test Tool Name | Test ID | Description of Vulnerability/Weakness | Comment
[EXAMPLE: 800-53 Rev4 Controls] | [PE-3 CCI: 000919] | [The organization enforces physical access authorizations at organization-defined entry/exit points to the facility where the information system resides.] | [Assessor initially marked this non-compliant based on information received from site personnel prior to the site visit. Further review with the Command Security Manager onsite revealed compliance with this control, as physical access is controlled and documented appropriately.]
ADD ROWS AS NEEDED
Section 5.3 - Identify any exceptions to the assessment testing which occurred during the assessment in the box below. Provide a summary of the issue, background information and details of the exception.
Summary of Issue:
Background:
References
This tab is INFORMATIONAL ONLY and provides reference material and educational information related to security assessment planning.
References:
Reference documents used to support testing and prepare this plan include but are not limited to (check for updated versions of each):
a. DODI 8510.01, Risk Management Framework (RMF) for DoD Information Technology (IT), 12 March 2014
b. Department of Defense (DoD) Cybersecurity Risk Assessment Guide, 22 April 2014
c. NIST SP 800-30 Rev1, Guide for Conducting Risk Assessments, September 2012
d. NIST SP 800-53 Rev4, Security and Privacy Controls for Federal Information Systems and Organizations, April 2013
e. NIST SP 800-53A Rev1, Guide for Assessing the Security Controls in Federal Information Systems and Organizations, June 2010
f. DoDI 5000.02, Operation of the Defense Acquisition System, 7 January 2015
g. Risk Management Framework Process Guide, 4 August 2017
h. Test and Evaluation Master Plan (TEMP) or Master Test Plan (MTP) (if applicable)
i. Governing Instructions (specific to system under test, if applicable)
Assessment Objectives:
The system is to be evaluated for compliance with the applicable NIST SP 800-53 security controls, in support of the Risk Management Framework (RMF). Any exceptions must be noted and reported in the test report, and results of non-compliance shall be recorded in the system Risk Assessment Report (RAR) for analysis and inclusion with the Security Assessment Report (SAR). The ongoing findings will then be documented in the Plan of Action and Milestones (POA&M) for future mitigation or remediation.
Architecture Diagram:
A network diagram can be provided as a separate artifact but is required to be included with the SAP. The diagram serves as a snapshot-in-time representation of the network at the time of assessment and should reflect the architecture that will be assessed and authorized (if applicable). Diagrams should clearly show connectivity and placement within the architecture. Each device shown should include its IP address, unique identifier (e.g., hostname), operating system, and function, and it should be possible to physically verify all network connections based on the diagrams provided. All interconnections with assets outside of the boundary should be clearly marked and include references to other authorizations. Out-of-band (OOB) management network connectivity should also be included, with references to the authorization of the OOB network if it is separate from this effort.
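Because the diagram must stay consistent with the Section 1.1 hardware list, a quick cross-check of device identifiers can catch drift before SCA review. The sketch below assumes both inventories have been exported to CSV files with a hostname column; the file names (diagram_devices.csv, baseline_hardware.csv) and the column name are hypothetical and should be adapted to however the diagram tool and hardware list are actually exported.

```python
# Minimal sketch: cross-check device identifiers between the architecture diagram
# and the Section 1.1 baseline hardware list. The CSV file names and the
# "hostname" column are hypothetical; adapt them to the actual exports.
import csv


def hostnames(path, column="hostname"):
    """Read one CSV export and return its normalized set of hostnames."""
    with open(path, newline="") as fh:
        return {row[column].strip().lower()
                for row in csv.DictReader(fh) if row.get(column)}


diagram = hostnames("diagram_devices.csv")     # hypothetical diagram export
baseline = hostnames("baseline_hardware.csv")  # hypothetical Section 1.1 export

print("On diagram but missing from baseline list:", sorted(diagram - baseline))
print("In baseline list but missing from diagram:", sorted(baseline - diagram))
```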
Verification Methods:
Examine: The process of reviewing, inspecting, observing, studying, or analyzing one or more assessment objects (i.e., specifications, mechanisms, or activities).
Interview: The process of holding discussions with individuals or groups of individuals within an organization to facilitate assessor understanding, achieve clarification, or obtain evidence.
Test: The process of exercising one or more assessment objects (i.e., activities or mechanisms) under specified conditions to compare actual with expected behavior.
Assessment Targets:
Component, Software, Technology, or Policy
Satisfactory/Unsatisfactory (Sat/Unsat) Criteria:
Establishing Satisfactory/Unsatisfactory (Sat/Unsat) Criteria - Prior to the commencement of formal assessment, the configuration, the Sat/Unsat criteria, and the execution process must be fully documented by the assessment team and approved by the system stakeholders through formal collaboration after Step 2 of the RMF process. The Sat/Unsat criteria for cybersecurity assessment differ slightly from those of functional testing: the goal of cybersecurity testing is to adequately assess the security features implemented or required for each system, regardless of disposition. Rather than a pass/fail method, the security assessment results in findings identified as 'open', 'closed', or 'not applicable', combined with an associated raw risk level that is used later in the residual risk analysis.

To determine the success or failure of a test, the assessor conducts any required data reduction or analysis and compares the actual results with the expected results. If an 'open' finding exists, the assessor must also ensure that the test adequately characterizes the risk level of the security feature being tested. All results of assessment procedures are documented in the respective test tool reporting format. All 'open' findings are then documented in the Risk Assessment Report (RAR) for further analysis, where they are evaluated to identify false positives and misleading results. Remaining findings are assigned a raw risk category, and a residual risk analysis is performed and documented in the RAR for each finding. The findings, along with the established residual risk, are then documented in the Security Assessment Report (SAR) for review and concurrence by the SCA.
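As one illustration of the disposition-plus-risk bookkeeping described above, the sketch below models a single finding so that its open/closed/not applicable status and its raw and residual risk levels travel together from test tool output through the RAR and SAR. The field names and risk categories are illustrative assumptions, not a mandated schema.

```python
# Minimal sketch: a record for one assessment finding, keeping the disposition
# (open / closed / not applicable) together with raw and residual risk levels.
# Field names and risk categories are illustrative, not a mandated schema.
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Disposition(Enum):
    OPEN = "open"
    CLOSED = "closed"
    NOT_APPLICABLE = "not applicable"


class RiskLevel(Enum):
    LOW = "low"
    MODERATE = "moderate"
    HIGH = "high"


@dataclass
class Finding:
    test_id: str                               # e.g., STIG rule ID or scan plugin ID
    control: str                               # traced NIST SP 800-53 control
    disposition: Disposition
    raw_risk: Optional[RiskLevel] = None       # assigned before mitigation analysis
    residual_risk: Optional[RiskLevel] = None  # assigned after residual risk analysis
    false_positive: bool = False
    misleading_result: bool = False
    comments: str = ""


# Example: an open finding awaiting residual risk analysis in the RAR.
example = Finding(test_id="V-12345", control="AC-2",
                  disposition=Disposition.OPEN, raw_risk=RiskLevel.MODERATE)
```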
Entrance/Exit Criteria:
Entrance criteria include all conditions required to be met prior to starting the test. These include:
a. Having completed the Assessment Architecture tab of this document
b. Having satisfactorily passed all functional tests
c. Having performed dry run(s) of assessment tests
d. Having all assets under assessment available to the assessor(s)
e. Having all requisite credential(s) and system access for assessment
f. Having identified data transfer requirements for exporting vulnerability test data from the system under assessment, if applicable
Exit criteria include those items that must be met prior to leaving RMF Step 4, Assess Security Controls. These include:
a. An established level of residual risk for all findings
b. An accurate characterization of all mitigations for each finding in the RAR
c. False positives and misleading results documented
d. Risk aggregation analysis performed
e. Risk Assessment Report (RAR) completed
f. Security Assessment Report (SAR) populated with all ongoing findings
g. Plan of Action and Milestones (POA&M) completed and updated
h. An established system-level residual risk
Logistics Support:
Resources such as spare parts, documentation, transportation, and training, along with the organizations providing them, may be required for the assessment and must be planned for.
Qualifications/Certifications:
Several personnel qualifications may be required for RMF supporting roles beyond the Validator. All qualifications and certifications should be reviewed by the PM/ISO before personnel are assigned in this SAP.
Security:
The ISSM should address security to reinforce the importance of protecting classified material and ensuring the integrity of the assessment process.
False Positives:
False positives are a condition in which a test case indicates non-compliance, but further analysis determines that the finding was incorrectly marked as open. This is normally associated with automated testing methods and can arise for a number of reasons. It is important to single out these conditions, record them for further review, and provide feedback to test tool maintainers. Doing so increases the overall accuracy of future test events and reduces the work needed to address the same conditions when testing is performed at other points in a system's lifecycle.
Misleading Results:
Misleading results are a condition that arises when testers' interpretations of an assessment procedure conflict. This happens due to the complex nature of assessment procedures, a misunderstanding of the goal of the test case, or the use of a different method for determining compliance status.
Misleading results are identified during further analysis of findings and a careful review of the test objective. Supporting details provided by testers assist in identifying and resolving these conditions, which reduces the effort of future test events and the work necessary to address the same conditions when testing is performed at other points in a system's lifecycle.