Implementing a Mapper
Mapper Introduction
OHDF Converters Utilities
Special utilities from OHDF Converters are used here. Refer here for a refresher on OHDF Converters.
Now that we have developed a mapping, we can implement it as a mapper which applies the mapping to any compatible file for conversion to OHDF.
File Setup
Before we begin actual mapper development, we first have to set up a number of crucial files that support and provide the infrastructure needed for the *-to-OHDF mapper.
Specialized Security Tools
This guide is geared toward security tools that provide scan-based export data. If your security tool provides a specialized form of export data or is accessed via an API, contact the SAF team for further guidance.
Mapper File
First, we need to create the file that hosts the mapper and link to it so other files in OHDF Converters can access it.
- Create a blank TypeScript file under the src directory in hdf-converters. It should be named: {YOUR-EXPORT-NAME-HERE}-mapper.ts
- Select the appropriate mapper skeleton (see below) for your export type. Place it in the file created in step 1. Replace names (SKELETON by default) as necessary.
JSON Mapper Skeleton
import { ExecJSON } from "inspecjs";
import _ from "lodash";
import { version as HeimdallToolsVersion } from "../package.json";
import {
BaseConverter,
ILookupPath,
impactMapping,
MappedTransform,
} from "./base-converter";
export class SKELETONMapper extends BaseConverter {
withRaw: boolean;
mappings: MappedTransform<
ExecJSON.Execution & { passthrough: unknown },
ILookupPath
> = {
platform: {
name: "Heimdall Tools",
release: HeimdallToolsVersion,
target_id: null, //Insert data
},
version: HeimdallToolsVersion,
statistics: {
duration: null, //Insert data
},
profiles: [
{
name: "", //Insert data
title: null, //Insert data
version: null, //Insert data
maintainer: null, //Insert data
summary: null, //Insert data
license: null, //Insert data
copyright: null, //Insert data
copyright_email: null, //Insert data
supports: [], //Insert data
attributes: [], //Insert data
depends: [], //Insert data
groups: [], //Insert data
status: "loaded", //Insert data
controls: [
{
key: "id",
tags: {}, //Insert data
descriptions: [], //Insert data
refs: [], //Insert data
source_location: {}, //Insert data
title: null, //Insert data
id: "", //Insert data
desc: null, //Insert data
impact: 0, //Insert data
code: null, //Insert data
results: [
{
status: ExecJSON.ControlResultStatus.Failed, //Insert data
code_desc: "", //Insert data
message: null, //Insert data
run_time: null, //Insert data
start_time: "", //Insert data
},
],
},
],
sha256: "",
},
],
passthrough: {
transformer: (data: Record<string, any>): Record<string, unknown> => {
return {
auxiliary_data: [{ name: "", data: _.omit([]) }], //Insert service name and mapped fields to be removed
...(this.withRaw && { raw: data }),
};
},
},
};
constructor(exportJson: string, withRaw = false) {
super(JSON.parse(exportJson), true);
this.withRaw = withRaw;
}
}

- Export your mapper class created in the previous steps by specifying its export in the index.ts file. Add the following line:

export * from './src/{YOUR-EXPORT-NAME-HERE}-mapper';

Sample File
Next, we need to add a sample file for the mapper to ingest when running unit tests on it.
- Create a new directory named {YOUR-EXPORT-NAME-HERE}_mapper under the sample_jsons directory in hdf-converters. Create another directory named sample_input_report in the directory you just made. The directory structure should look like this:

+-- sample_jsons
|  +-- {YOUR-EXPORT-NAME-HERE}_mapper
|  |  +-- sample_input_report

- Place your sample export under the sample_input_report directory. Your sample export should be genericized to avoid any leaking of sensitive information. The directory structure should now look like this:
+-- sample_jsons
|  +-- {YOUR-EXPORT-NAME-HERE}_mapper
|  |  +-- sample_input_report
|  |  |  +-- {YOUR-SAMPLE-EXPORT}

Regression Testing
Now that we have a sample file, we can add regression tests that automatically exercise our mapper to ensure that it produces readable and correct OHDF output.
- Create a blank TypeScript file under the test/mappers/forward directory in hdf-converters. It should be named: {YOUR-EXPORT-NAME-HERE}_mapper.spec.ts
- Select the appropriate mapper test skeleton (see below) for your export type. Place it in the file created in step 1. Replace names (SKELETON by default) as necessary.
JSON Mapper Test Skeleton
import fs from "fs";
import { SKELETONMapper } from "../../../src/SKELETON-mapper";
import { omitVersions } from "../../utils";
describe("SKELETON_mapper", () => {
it("Successfully converts SKELETON targeted at a local/cloned repository data", () => {
const mapper = new SKELETONMapper(
fs.readFileSync(
"sample_jsons/SKELETON_mapper/sample_input_report/SKELETON.json",
{ encoding: "utf-8" }
)
);
// fs.writeFileSync(
// 'sample_jsons/SKELETON_mapper/SKELETON-hdf.json',
// JSON.stringify(mapper.toHdf(), null, 2)
// );
expect(omitVersions(mapper.toHdf())).toEqual(
omitVersions(
JSON.parse(
fs.readFileSync("sample_jsons/SKELETON_mapper/SKELETON-hdf.json", {
encoding: "utf-8",
})
)
)
);
});
});
describe("SKELETON_mapper_withraw", () => {
it("Successfully converts withraw flagged SKELETON targeted at a local/cloned repository data", () => {
const mapper = new SKELETONMapper(
fs.readFileSync(
"sample_jsons/SKELETON_mapper/sample_input_report/SKELETON.json",
{ encoding: "utf-8" }
),
true
);
// fs.writeFileSync(
// 'sample_jsons/SKELETON_mapper/SKELETON-hdf-withraw.json',
// JSON.stringify(mapper.toHdf(), null, 2)
// );
expect(omitVersions(mapper.toHdf())).toEqual(
omitVersions(
JSON.parse(
fs.readFileSync(
"sample_jsons/SKELETON_mapper/SKELETON-hdf-withraw.json",
{
encoding: "utf-8",
}
)
)
)
);
});
});

Fingerprinting
OHDF Converters has a fingerprinting service that detects a security tool's data format and automatically applies the correct mapper to convert it to OHDF. To enable this feature, we need to explicitly declare keywords unique to the security tool's data format as follows:
- Go to the file report_intake.ts under the heimdall2/apps/frontend/src/store directory.
- Import your mapper file. You should be able to add the name of your mapper class to a pre-existing import statement pointing at hdf-converters as follows:
import {
ASFFResults as ASFFResultsMapper,
BurpSuiteMapper,
...
{YOUR-MAPPER-CLASS-HERE}
} from '@mitre/hdf-converters';

- Instantiate your mapper class in the convertToHdf switch block. Add the following lines:
case '{YOUR-EXPORT-SERVICE-NAME-HERE}':
  return new {YOUR-MAPPER-CLASS-HERE}(convertOptions.data).toHdf();

- Navigate to the file fingerprinting.ts in the src/utils directory in hdf-converters. Add keywords that are unique to your sample export to the fileTypeFingerprints variable. It should be formatted as follows:
export const fileTypeFingerprints = {
asff: ['Findings', 'AwsAccountId', 'ProductArn'],
...
{YOUR-EXPORT-SERVICE-NAME-HERE}: [{UNIQUE KEYWORDS AS STRINGS}]
};

Mapper Implementation
With the necessary files now set up, we can begin the actual creation of the OHDF mapper using the skeleton mapper base in the {YOUR-EXPORT-NAME-HERE}-mapper.ts file. The skeleton mapper and the base-converter class have been designed to provide the base functionality needed for *-to-OHDF mapper generation. For most developers, mapper creation will be limited to assigning objects from the export structure to the corresponding attributes in the mapper according to the mappings they developed earlier.
File Processing
Certain security services produce exports which are not immediately usable by the skeleton mapper. In such cases, pre-processing of the export and/or post-processing of the generated OHDF file is necessary to ensure compatibility.
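For example, suppose a hypothetical tool emits its findings as a bare top-level JSON array instead of an object. A small pre-processing step could wrap the array before the mapper parses it. This is only a sketch; normalizeExport and the findings key are illustrative assumptions, not part of OHDF Converters:

```typescript
// Hypothetical pre-processing helper: wraps a top-level JSON array in an
// object so that the skeleton mapper can address its contents by name.
// Illustrative sketch only -- not an OHDF Converters API.
function normalizeExport(rawExport: string): string {
  const parsed: unknown = JSON.parse(rawExport);
  if (Array.isArray(parsed)) {
    // Give the anonymous array a key that `path` lookups can target
    return JSON.stringify({findings: parsed});
  }
  return rawExport;
}
```

A mapper constructor could then call super(JSON.parse(normalizeExport(exportJson)), true) so that later path lookups have a named field to traverse into.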
While developing the mapper, it's useful to occasionally inspect the generated OHDF file to verify that the mapper is working as intended. You can do this by starting a local instance of Heimdall with the following command:
yarn start:dev

Note that the yarn start:dev command will dynamically rebuild the application upon changes to the Heimdall frontend. However, if you make any changes to OHDF Converters, you will have to restart the command entirely.
Import your source data and then export it as an OHDF JSON to check what your mapper is actually mapping.
Mapper Demo - GoSec
This section is a demonstration of implementing an OHDF mapper for GoSec, assuming that the appropriate file setup for the mapper has been performed.
Here is our developed mapping for GoSec for reference:
GoSec-to-OHDF Mapping
{
platform: { // We fill in Heimdall for the platform as it handles the generation of this OHDF file
name: 'Heimdall Tools',
release: HeimdallToolsVersion
},
version: HeimdallToolsVersion, // See 'platform' reasoning
statistics: {}, // Not enough info to fill
profiles: [
{
name: 'Gosec scanner', // We know that this report is generated from GoSec
version: GosecVersion, // Version of GoSec instance
sha256: '', // Leave it empty as OHDF Converters will generate one for us
title: 'gosec', // We know that this report is generated from GoSec
supports: [], // Not enough info to fill
attributes: [], // Not enough info to fill
groups: [], // Not enough info to fill
controls: [
{
id: Issues.rule_id, // ID of the requirement
title: Issues.details, // Human readable title for the requirement
desc: '', // Not enough info to fill
impact: 0.5, // Have no solid information on impact of security issue, so we default to 0.5
refs: [], // Not enough info to fill
tags: {
Issues.cwe, // Associated CWE for the requirement
Issues.severity, // Severity of the requirement
Issues.confidence, // Applicability of the requirement
Issues.nosec, // Whether to ignore the requirement
Issues.suppressions // Info suppression level of the requirement
},
source_location: {}, // Not enough info to fill
results: [
{
status: 'failed', // The security scan only reports failed requirements, so all findings we receive get fail statuses
code_desc: Issues.code, // The code failing the requirement test
message: Issues.file + Issues.line + Issues.column, // All materials describing where the issue occurred
start_time // Not enough info to fill
}
]
},
],
status: 'loaded' // Give loaded status to denote that profile is loaded by OHDF Converters
},
],
passthrough: {
auxiliary_data: [
{ // Go source data compilation errors; Stats on GoSec scan
name: 'Gosec',
data: Golang errors, Stats
}
],
raw
}
}

GoSec Annotated Source Data
{
// Purpose: Go compilation errors
// Recording: Metadata - Not specifically related to the requirements; if critical, it will already be recorded as a security issue in 'Issues'
"Golang errors": {},
// Purpose: Container for identified security issues
// Recording: Requirements - This entity records all identified security issues in a Go source code
"Issues": [
{
// Purpose: The severity of the identified issue
// Recording: Requirements - This is specifically related to the severity level of the requirement
"severity": "MEDIUM",
// Purpose: How sure we are that the identified issue is applicable to this source code
// Recording: Requirements testing - This field gives the applicability of the issue after source code testing
"confidence": "HIGH",
// Purpose: The associated CWE for the security issue
// Recording: Requirements - This gives the associated CWE for the security issue
"cwe": {
"id": "22",
"url": "https://cwe.mitre.org/data/definitions/22.html"
},
// Purpose: The internal GoSec ID for the security issue
// Recording: Requirements - This gives an ID for the security issue
"rule_id": "G304",
// Purpose: Explanation of the security issue
// Recording: Requirements - This explains the security issue
"details": "Potential file inclusion via variable",
// Purpose: The offending file
// Recording: Requirement testing - This specifically notes which file fails the requirement after source code testing
"file": "C:\\Users\\AGILLUM\\OneDrive - The MITRE Corporation\\Documents\\Code\\grype-0.34.4\\internal\\file\\tar.go",
// Purpose: The offending code
// Recording: Requirement testing - This specifies the code that fails the requirement after source code testing
"code": "82: \t\tcase tar.TypeReg:\n83: \t\t\tf, err := os.OpenFile(target, os.O_CREATE|os.O_RDWR, os.FileMode(header.Mode))\n84: \t\t\tif err != nil {\n",
// Purpose: The line number of the offending code
// Recording: Requirement testing - This field specifies the location of the failing code
"line": "83",
// Purpose: The column number of the offending code
// Recording: Requirement testing - This field specifies the location of the failing code
"column": "14",
// Purpose: Whether this security issue should be ignored
// Recording: Requirements - Specifies whether this security issue should be ignored
"nosec": false,
// Purpose: The suppression level for info on the security issue
// Recording: Requirements - Specifies the info suppression level of the security issue
"suppressions": null
}
],
// Purpose: The statistics of the GoSec scan on the source code
// Recording: Metadata - Info on the scan itself
"Stats": {
"files": 199,
"lines": 12401,
"nosec": 0,
"found": 7
},
// Purpose: The version of the GoSec instance currently running
// Recording: Metadata - Info on the scan itself
"GosecVersion": "dev"
}

Mapper creation has been streamlined to be as simple as possible for a developer. Most of the work involves simple references to the object path for a field in the source data, like so:
version: {
  path: "GosecVersion"
}

This primarily applies in cases where the field is simply carried over. Some fields from the source data need to be processed or transformed in some way, which will be elaborated upon later.
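Under the hood, a path lookup amounts to resolving a dotted object path against the parsed export, much like lodash's _.get. The helper below is a hypothetical simplification for illustration only; the actual resolution logic lives in base-converter:

```typescript
// Illustrative stand-in for how a `path` value resolves against the parsed
// export. This is a simplified sketch, not the real base-converter logic.
function resolvePath(data: Record<string, unknown>, path: string): unknown {
  return path.split(".").reduce<unknown>(
    (obj, key) =>
      obj && typeof obj === "object"
        ? (obj as Record<string, unknown>)[key]
        : undefined,
    data
  );
}

// A fragment of the GoSec sample export from this guide
const sampleExport = {
  GosecVersion: "dev",
  Stats: {files: 199, lines: 12401}
};

resolvePath(sampleExport, "GosecVersion"); // "dev"
resolvePath(sampleExport, "Stats.files"); // 199
```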
Unfilled/Omitted and Hard Coded Fields
First, let's assign mappings which are unfilled/omitted or are not dependent on the source data (i.e., hard-coded data). These include fields like our mappings for profiles.name and profiles.sha256.
GoSec-to-OHDF Mapper
import { ExecJSON } from "inspecjs";
import _ from "lodash";
import { version as HeimdallToolsVersion } from "../package.json";
import {
BaseConverter,
ILookupPath,
impactMapping,
MappedTransform,
} from "./base-converter";
export class GoSecMapper extends BaseConverter {
withRaw: boolean;
mappings: MappedTransform<
ExecJSON.Execution & { passthrough: unknown },
ILookupPath
> = {
platform: {
name: "Heimdall Tools",
release: HeimdallToolsVersion,
},
version: HeimdallToolsVersion,
statistics: {},
profiles: [
{
name: "Gosec scanner",
title: "gosec",
version: null, //Insert data
supports: [],
attributes: [],
groups: [],
status: "loaded",
controls: [
{
key: "id",
tags: {}, //Insert data
refs: [],
source_location: {},
title: null, //Insert data
id: "", //Insert data
desc: "",
impact: 0.5,
results: [
{
status: ExecJSON.ControlResultStatus.Failed,
code_desc: "", //Insert data
message: null, //Insert data
start_time: "",
},
],
},
],
sha256: "",
},
],
passthrough: {
transformer: (data: Record<string, any>): Record<string, unknown> => {
return {
auxiliary_data: [{ name: "", data: _.omit([]) }], //Insert service name and mapped fields to be removed
...(this.withRaw && { raw: data }),
};
},
},
};
constructor(exportJson: string, withRaw = false) {
super(JSON.parse(exportJson), true);
this.withRaw = withRaw;
}
}

Remaining GoSec-to-OHDF Mapping
{
profiles: [
{
version: GosecVersion, // Version of GoSec instance
controls: [
{
id: Issues.rule_id, // ID of the requirement
title: Issues.details, // Human readable title for the requirement
tags: {
Issues.cwe, // Associated CWE for the requirement
Issues.severity, // Severity of the requirement
Issues.confidence, // Applicability of the requirement
Issues.nosec, // Whether to ignore the requirement
Issues.suppressions // Info suppression level of the requirement
},
results: [
{
code_desc: Issues.code, // The code failing the requirement test
message: Issues.file + Issues.line + Issues.column // All materials describing where the issue occurred
}
]
},
]
},
],
passthrough: {
auxiliary_data: [
{ // Go source data compilation errors; Stats on GoSec scan
name: 'Gosec',
data: Golang errors, Stats
}
],
raw
}
}

Simple Portable Fields
Next, let's look at the fields which can simply be ported over directly from the source data, like GosecVersion. To do this, we just need to invoke the path keyword from base-converter and feed it the direct JSON object path as a value, like so:
version: {
  path: "GosecVersion"
}

For nested fields (i.e., fields requiring traversal through parent fields), we need to have the mapper traverse into the level containing the fields we want to access. Think of this as similar to using cd to traverse through a directory to access a file. For example, we can allow access to the fields under the Issues superfield in the source data using path as follows:
path: "Issues"

Let's put this into practice and start implementing the mappings for simple fields that don't require transformation or processing:
GoSec-to-OHDF Mapper
import { ExecJSON } from "inspecjs";
import _ from "lodash";
import { version as HeimdallToolsVersion } from "../package.json";
import {
BaseConverter,
ILookupPath,
impactMapping,
MappedTransform,
} from "./base-converter";
export class GoSecMapper extends BaseConverter {
withRaw: boolean;
mappings: MappedTransform<
ExecJSON.Execution & { passthrough: unknown },
ILookupPath
> = {
platform: {
name: "Heimdall Tools",
release: HeimdallToolsVersion,
},
version: HeimdallToolsVersion,
statistics: {},
profiles: [
{
name: "Gosec scanner",
title: "gosec",
version: { path: "GosecVersion" },
supports: [],
attributes: [],
groups: [],
status: "loaded",
controls: [
{
path: "Issues",
key: "id",
tags: {
cwe: { path: "cwe" },
nosec: { path: "nosec" },
suppressions: { path: "suppressions" },
severity: { path: "severity" },
confidence: { path: "confidence" },
},
refs: [],
source_location: {},
title: { path: "details" },
id: { path: "rule_id" },
desc: "",
impact: 0.5,
results: [
{
status: ExecJSON.ControlResultStatus.Failed,
code_desc: { path: "code" },
message: null, //Insert data
start_time: "",
},
],
},
],
sha256: "",
},
],
passthrough: {
transformer: (data: Record<string, any>): Record<string, unknown> => {
return {
auxiliary_data: [{ name: "", data: _.omit([]) }], //Insert service name and mapped fields to be removed
...(this.withRaw && { raw: data }),
};
},
},
};
constructor(exportJson: string, withRaw = false) {
super(JSON.parse(exportJson), true);
this.withRaw = withRaw;
}
}

Remaining GoSec-to-OHDF Mapping
{
profiles: [
{
controls: [
{
results: [
{
message: Issues.file + Issues.line + Issues.column // All materials describing where the issue occurred
}
]
},
]
},
],
passthrough: {
auxiliary_data: [
{ // Go source data compilation errors; Stats on GoSec scan
name: 'Gosec',
data: Golang errors, Stats
}
],
raw
}
}

Transformed/Processed Fields
Finally, let's look at fields that require some level of processing before we can use them in the OHDF mapper.
Fields like Issues.file, Issues.line, and Issues.column need to be combined to form a coherent locational message. We can combine them with a simple function that concatenates them. The function to do this is provided:
function formatMessage(input: Record<string, unknown>): string {
return `${_.get(input, "file")}, line:${_.get(input, "line")}, column:${_.get(
input,
"column"
)}`;
}

This function just pulls the three fields and joins them into a single string.
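For example, applied to the values from the annotated sample export above (with the file path shortened for readability), the function yields a single location string. The version below substitutes plain property access for lodash's _.get so that the snippet is self-contained:

```typescript
// Same behavior as the formatMessage function above, but using plain
// property access instead of lodash so the example stands alone.
function formatMessage(input: Record<string, unknown>): string {
  return `${input["file"]}, line:${input["line"]}, column:${input["column"]}`;
}

// Values taken from the GoSec annotated sample data (file path shortened)
const issue = {
  file: "internal/file/tar.go",
  line: "83",
  column: "14"
};

formatMessage(issue); // "internal/file/tar.go, line:83, column:14"
```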
To invoke this, we can use the transformer keyword and feed the function as a value for the transformer to invoke. This is implemented as follows:
GoSec-to-OHDF Mapper
import { ExecJSON } from "inspecjs";
import _ from "lodash";
import { version as HeimdallToolsVersion } from "../package.json";
import {
BaseConverter,
ILookupPath,
impactMapping,
MappedTransform,
} from "./base-converter";
function formatMessage(input: Record<string, unknown>): string {
return `${_.get(input, "file")}, line:${_.get(input, "line")}, column:${_.get(
input,
"column"
)}`;
}
export class GoSecMapper extends BaseConverter {
withRaw: boolean;
mappings: MappedTransform<
ExecJSON.Execution & { passthrough: unknown },
ILookupPath
> = {
platform: {
name: "Heimdall Tools",
release: HeimdallToolsVersion,
},
version: HeimdallToolsVersion,
statistics: {},
profiles: [
{
name: "Gosec scanner",
title: "gosec",
version: { path: "GosecVersion" },
supports: [],
attributes: [],
groups: [],
status: "loaded",
controls: [
{
path: "Issues",
key: "id",
tags: {
cwe: { path: "cwe" },
nosec: { path: "nosec" },
suppressions: { path: "suppressions" },
severity: { path: "severity" },
confidence: { path: "confidence" },
},
refs: [],
source_location: {},
title: { path: "details" },
id: { path: "rule_id" },
desc: "",
impact: 0.5,
results: [
{
status: ExecJSON.ControlResultStatus.Failed,
code_desc: { path: "code" },
message: { transformer: formatMessage },
start_time: "",
},
],
},
],
sha256: "",
},
],
passthrough: {
transformer: (data: Record<string, any>): Record<string, unknown> => {
return {
auxiliary_data: [{ name: "", data: _.omit([]) }], //Insert service name and mapped fields to be removed
...(this.withRaw && { raw: data }),
};
},
},
};
constructor(exportJson: string, withRaw = false) {
super(JSON.parse(exportJson), true);
this.withRaw = withRaw;
}
}

Remaining GoSec-to-OHDF Mapping
{
passthrough: {
auxiliary_data: [
{ // Go source data compilation errors; Stats on GoSec scan
name: 'Gosec',
data: Golang errors, Stats
}
],
raw
}
}

We can also combine keywords, such as using path to traverse to a particular field and then applying a function with transformer. For example, we can target the Issues.cwe field specifically and apply the following function to generate the corresponding NIST 800-53 tags:
import { CweNistMapping } from "./mappings/CweNistMapping";
const CWE_NIST_MAPPING = new CweNistMapping();
const DEFAULT_NIST_TAG = ["SI-2", "RA-5"];
function nistTag(input: Record<string, unknown>): string[] {
const cwe = [`${_.get(input, "id")}`];
return CWE_NIST_MAPPING.nistFilter(cwe, DEFAULT_NIST_TAG);
}

We can then correlate it with the profiles.controls.tags.nist field in the OHDF mapper like so:
GoSec-to-OHDF Mapper
import { ExecJSON } from "inspecjs";
import _ from "lodash";
import { version as HeimdallToolsVersion } from "../package.json";
import {
BaseConverter,
ILookupPath,
impactMapping,
MappedTransform,
} from "./base-converter";
import { CweNistMapping } from "./mappings/CweNistMapping";
const CWE_NIST_MAPPING = new CweNistMapping();
const DEFAULT_NIST_TAG = ["SI-2", "RA-5"];
function nistTag(input: Record<string, unknown>): string[] {
const cwe = [`${_.get(input, "id")}`];
return CWE_NIST_MAPPING.nistFilter(cwe, DEFAULT_NIST_TAG);
}
function formatMessage(input: Record<string, unknown>): string {
return `${_.get(input, "file")}, line:${_.get(input, "line")}, column:${_.get(
input,
"column"
)}`;
}
export class GoSecMapper extends BaseConverter {
withRaw: boolean;
mappings: MappedTransform<
ExecJSON.Execution & { passthrough: unknown },
ILookupPath
> = {
platform: {
name: "Heimdall Tools",
release: HeimdallToolsVersion,
},
version: HeimdallToolsVersion,
statistics: {},
profiles: [
{
name: "Gosec scanner",
title: "gosec",
version: { path: "GosecVersion" },
supports: [],
attributes: [],
groups: [],
status: "loaded",
controls: [
{
path: "Issues",
key: "id",
tags: {
nist: {
path: "cwe",
transformer: nistTag,
},
cwe: { path: "cwe" },
nosec: { path: "nosec" },
suppressions: { path: "suppressions" },
severity: { path: "severity" },
confidence: { path: "confidence" },
},
refs: [],
source_location: {},
title: { path: "details" },
id: { path: "rule_id" },
desc: "",
impact: 0.5,
results: [
{
status: ExecJSON.ControlResultStatus.Failed,
code_desc: { path: "code" },
message: { transformer: formatMessage },
start_time: "",
},
],
},
],
sha256: "",
},
],
passthrough: {
transformer: (data: Record<string, any>): Record<string, unknown> => {
return {
auxiliary_data: [{ name: "", data: _.omit([]) }], //Insert service name and mapped fields to be removed
...(this.withRaw && { raw: data }),
};
},
},
};
constructor(exportJson: string, withRaw = false) {
super(JSON.parse(exportJson), true);
this.withRaw = withRaw;
}
}

Remaining GoSec-to-OHDF Mapping
{
passthrough: {
auxiliary_data: [
{ // Go source data compilation errors; Stats on GoSec scan
name: 'Gosec',
data: Golang errors, Stats
}
],
raw
}
}

For the remaining fields that we want to place in passthrough, we need to use the lodash library to preserve each field in its entirety. In particular, we will use the _.get(OBJECT, FIELD) function to pull a field from the source data object.
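Conceptually, _.get pulls the named field off the parsed export; for a flat key, this is equivalent to a plain property access. The sketch below shows what the passthrough transformer assembles for the GoSec sample data, using a minimal local get stand-in so the snippet runs without lodash:

```typescript
// Minimal stand-in for lodash's _.get on flat keys; the real mapper uses
// _.get from lodash, which also supports nested paths.
function get(obj: Record<string, any>, field: string): any {
  return obj[field];
}

// A fragment of the GoSec sample export from this guide
const data: Record<string, any> = {
  "Golang errors": {},
  Stats: {files: 199, lines: 12401, nosec: 0, found: 7}
};

// Mirrors the auxiliary_data array that the passthrough transformer returns
const auxiliaryData = [
  {
    name: "Gosec",
    data: {
      "Golang errors": get(data, "Golang errors"),
      Stats: get(data, "Stats")
    }
  }
];
```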
GoSec-to-OHDF Mapper
import { ExecJSON } from "inspecjs";
import _ from "lodash";
import { version as HeimdallToolsVersion } from "../package.json";
import {
BaseConverter,
ILookupPath,
impactMapping,
MappedTransform,
} from "./base-converter";
import { CweNistMapping } from "./mappings/CweNistMapping";
const CWE_NIST_MAPPING = new CweNistMapping();
const DEFAULT_NIST_TAG = ["SI-2", "RA-5"];
function nistTag(input: Record<string, unknown>): string[] {
const cwe = [`${_.get(input, "id")}`];
return CWE_NIST_MAPPING.nistFilter(cwe, DEFAULT_NIST_TAG);
}
function formatMessage(input: Record<string, unknown>): string {
return `${_.get(input, "file")}, line:${_.get(input, "line")}, column:${_.get(
input,
"column"
)}`;
}
export class GoSecMapper extends BaseConverter {
withRaw: boolean;
mappings: MappedTransform<
ExecJSON.Execution & { passthrough: unknown },
ILookupPath
> = {
platform: {
name: "Heimdall Tools",
release: HeimdallToolsVersion,
},
version: HeimdallToolsVersion,
statistics: {},
profiles: [
{
name: "Gosec scanner",
title: "gosec",
version: { path: "GosecVersion" },
supports: [],
attributes: [],
groups: [],
status: "loaded",
controls: [
{
path: "Issues",
key: "id",
tags: {
nist: {
path: "cwe",
transformer: nistTag,
},
cwe: { path: "cwe" },
nosec: { path: "nosec" },
suppressions: { path: "suppressions" },
severity: { path: "severity" },
confidence: { path: "confidence" },
},
refs: [],
source_location: {},
title: { path: "details" },
id: { path: "rule_id" },
desc: "",
impact: 0.5,
results: [
{
status: ExecJSON.ControlResultStatus.Failed,
code_desc: { path: "code" },
message: { transformer: formatMessage },
start_time: "",
},
],
},
],
sha256: "",
},
],
passthrough: {
transformer: (data: Record<string, any>): Record<string, unknown> => {
return {
auxiliary_data: [
{
name: "Gosec",
data: {
"Golang errors": _.get(data, "Golang errors"),
Stats: _.get(data, "Stats"),
},
},
],
...(this.withRaw && { raw: data }),
};
},
},
};
constructor(exportJson: string, withRaw = false) {
super(JSON.parse(exportJson), true);
this.withRaw = withRaw;
}
}

Now we have a fully implemented GoSec-to-OHDF mapper.
Mapper Demo - DbProtect
This section is a demonstration of implementing an OHDF mapper that handles XML-based source data, namely DbProtect. As in the previous section, we assume that the appropriate file setup for the mapper has been performed.
Here is our developed mapping for DbProtect for reference:
DbProtect-to-OHDF Mapping
{
platform: {
name: 'Heimdall Tools',
release: HeimdallToolsVersion,
target_id
},
version: HeimdallToolsVersion,
statistics: {
duration
},
profiles: [
{
name: 'Policy',
version,
sha256,
title: 'Job Name',
maintainer,
summary: ['Organization', 'Asset', 'Asset Type', 'IP Address, Port, Instance'],
license,
copyright,
copyright_email,
supports,
attributes,
groups,
controls: [
{
id: 'Check ID', // ID of the requirement
title: 'Check',
desc: ['Task', 'Check Category'],
descriptions,
impact: 'Risk DV',
refs,
tags: {
nist: DEFAULT_STATIC_CODE_ANALYSIS_NIST_TAGS,
cci: DEFAULT_STATIC_CODE_ANALYSIS_CCI_TAGS
},
code,
source_location,
results: [
{
status: 'Result Status', // The result of the scan for that particular control
code_desc: 'Details',
message,
run_time,
start_time: 'Date' // Some indication of when the scan was run
}
]
},
],
status: 'loaded'
},
],
passthrough: {
auxiliary_data: [
{
name,
data
},
],
raw
}
}

DbProtect Annotated Source Data
<?xml version="1.0" encoding="utf-8"?>
<dataset xmlns="http://developer.cognos.com/schemas/xmldata/1/" xmlns:xs="http://www.w3.org/2001/XMLSchema-instance">
<!-- Purpose: Assigning keys to the value fields in the data subsection -->
<!-- Recording: Metadata - Labels of the values (might be useful when parsing) -->
<metadata>
<item name="Organization" type="xs:string" length="202"/>
<item name="Task" type="xs:string" length="802"/>
<item name="Asset Type" type="xs:string" length="66"/>
<item name="Asset" type="xs:string" length="1026"/>
<item name="IP Address, Port, Instance" type="xs:string" length="532"/>
<item name="Job Name" type="xs:string" length="258"/>
<item name="Policy" type="xs:string" length="130"/>
<item name="Result Status" type="xs:string" length="802"/>
<item name="Check Category" type="xs:string" length="802"/>
<item name="Risk DV" type="xs:string" length="802"/>
<item name="Check ID" type="xs:int" precision="1"/>
<item name="Check" type="xs:string" length="1026"/>
<item name="Details" type="xs:string" length="8002"/>
<item name="Date" type="xs:string" length="54"/>
</metadata>
<data>
<row>
<!-- Purpose: Customer's organization name -->
<!-- Recording: Metadata. This just tells us about the scan itself, not the compliance-related items -->
<value>TEST ORGANIZATION (Local DBP server)</value>
<!-- Purpose: Vendor's term for the type of scan, in this case an "Audit" -->
<!-- Recording: Metadata. Again, not telling us about any compliance-related items -->
<value>Audit</value>
<!-- Purpose: Type of database scanned by the DBProtect tool -->
<!-- Recording: Metadata. This one is up for debate. The type of database is not directly related to controls, but we could include it in the results if we want. -->
<value>Microsoft SQL Server</value>
<!-- Purpose: Name of the Server hosting the database -->
<!-- Recording: Metadata, for similar reasons as the database type. -->
<value>CONDS181</value>
<!-- Purpose: Database's IP, Port, Instance -->
<!-- Recording: Metadata. See above. -->
<value>10.0.10.204, 1433, MSSQLSERVER</value>
<!-- Purpose: Customer's added name for the report -->
<!-- Recording: Metadata. See above. -->
<value>Heimdall Test scan report generation</value>
<!-- Purpose: The baseline guidance that the scan tested against -->
<!-- Recording: Requirement. This is more general information, but still compliance-related so we can consider it a requirement. -->
<value>DISA-STIG SQL Server 2016 V2R1-1 Audit (Built-In)</value>
<!-- Purpose: Result of the test -->
<!-- Recording: Requirement testing. This tells us whether the control was fulfilled or not. -->
<value>Fact</value>
<!-- Purpose: Type of requirement -->
<!-- Recording: Requirements. This tells us the type of control being tested. -->
<value>Improper Access Controls</value>
<!-- Purpose: Severity of the requirement -->
<!-- Recording: Requirements. Tells us more about the control being tested. -->
<value>Medium</value>
<!-- Purpose: Vendor's unique ID for the type of test -->
<!-- Recording: Requirements. This is a unique identifier for each test we are running. We may want to collapse results of the same ID when making our mapper. -->
<value>2986</value>
<!-- Purpose: Check type -->
<!-- Recording: Requirements. This tells us about the type of control being tested as well. -->
<value>Schema ownership</value>
<!-- Purpose: Result details -->
<!-- Recording: Requirements. This tells us about the specific areas related to the control. -->
<value>Schema name=DatabaseMailUserRole;Database=msdb;Owner name=DatabaseMailUserRole</value>
<!-- Purpose: Date of the scan/tests -->
<!-- Recording: Requirement testing. This tells us when the scan was run. -->
<value>Feb 18 2021 15:57</value>
</row>
...
</data>
</dataset>

As in the previous GoSec example, most of the work involves simple references to the object path for a field in the source data. For example:
title: {
  path: "Check"
}

Again, some fields from the source data need to be processed or transformed in some way, which will be elaborated upon later.
Unfilled/Omitted and Hard Coded Fields
First, let's assign the mappings that are unfilled/omitted or do not depend on the source data (i.e., hard-coded data). This includes fields like our mappings for platform.name and profiles.sha256. Note that for our tags, we use the global method from the Heimdall repo, getCCIsForNISTTags, as well as the global constant DEFAULT_STATIC_CODE_ANALYSIS_NIST_TAGS. We can also place the source data directly into the passthrough. Also note the compileFindings function we wrote to deal with the unusual structure of the XML, which correlates metadata with each row in the data field. No need to worry about this function for now; just think of it as a way of explicitly assigning the metadata keys to each data row, so that our base converter can easily access fields and know what they are.
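If it helps, here is a small standalone sketch (with hypothetical sample values) of the reshaping compileFindings performs: parseXml leaves the column names under dataset.metadata.item and the row values under dataset.data.row, and we zip the two together so every row becomes a plainly keyed object.

```typescript
// Hypothetical miniature of the parsed DbProtect XML structure.
const parsed = {
  dataset: {
    metadata: { item: [{ name: "Check" }, { name: "Result Status" }] },
    data: { row: [{ value: ["Schema ownership", "Fact"] }] },
  },
};

// Pair each row's positional values with the metadata column names.
const keyNames = parsed.dataset.metadata.item.map((item) => item.name);
const data = parsed.dataset.data.row.map((row) =>
  Object.fromEntries(keyNames.map((name, i) => [name, row.value[i]]))
);

console.log(data);
// [{ Check: "Schema ownership", "Result Status": "Fact" }]
```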
DbProtect-to-OHDF Mapper
import { ExecJSON } from "inspecjs";
import * as _ from "lodash";
import { version as HeimdallToolsVersion } from "../package.json";
import {
BaseConverter,
ILookupPath,
impactMapping,
MappedTransform,
parseXml,
} from "./base-converter";
import {
  DEFAULT_STATIC_CODE_ANALYSIS_NIST_TAGS,
  getCCIsForNISTTags,
} from "./utils/global";
function compileFindings(
input: Record<string, unknown>
): Record<string, unknown> {
const keys = _.get(input, "dataset.metadata.item");
const findings = _.get(input, "dataset.data.row");
let output: unknown[] = [];
if (Array.isArray(keys) && Array.isArray(findings)) {
const keyNames = keys.map((element: Record<string, unknown>): string => {
return _.get(element, "name") as string;
});
output = findings.map((element: Record<string, unknown>) => {
return Object.fromEntries(
keyNames.map(function (name: string, i: number) {
return [name, _.get(element, `value[${i}]`)];
})
);
});
}
return Object.fromEntries([["data", output]]);
}
export class DBProtectMapper extends BaseConverter {
withRaw: boolean;
mappings: MappedTransform<
ExecJSON.Execution & { passthrough: unknown },
ILookupPath
> = {
platform: {
name: "Heimdall Tools",
release: HeimdallToolsVersion,
},
version: HeimdallToolsVersion,
statistics: {},
profiles: [
{
name: {},
title: {},
summary: {},
supports: [],
attributes: [],
groups: [],
status: "loaded",
controls: [
{
key: "id",
tags: {
nist: DEFAULT_STATIC_CODE_ANALYSIS_NIST_TAGS,
cci: getCCIsForNISTTags(DEFAULT_STATIC_CODE_ANALYSIS_NIST_TAGS),
},
refs: [],
source_location: {},
title: {},
id: {},
desc: {},
impact: {},
code: {},
results: [
{
status: {},
code_desc: {},
start_time: {},
},
],
},
],
sha256: "",
},
],
passthrough: {
transformer: (data: Record<string, unknown>): Record<string, unknown> => {
return { ...(this.withRaw && { raw: data }) };
},
},
};
constructor(dbProtectXml: string, withRaw = false) {
super(compileFindings(parseXml(dbProtectXml)));
this.withRaw = withRaw;
}
}

Remaining DbProtect-to-OHDF Mapping
{
profiles: [
{
name: "Policy",
title: "Job Name",
maintainer,
summary: [
"Organization",
"Asset",
"Asset Type",
"IP Address, Port, Instance",
],
license,
copyright,
copyright_email,
supports,
attributes,
groups,
controls: [
{
id: "Check ID", // ID of the requirement
title: "Check",
desc: ["Task", "Check Category"],
descriptions,
impact: "Risk DV",
refs,
code,
source_location,
results: [
{
status: "Result Status", // The result of the scan for that particular control
code_desc: "Details",
message,
run_time,
start_time: "Date", // Some indication of when the scan was run
},
],
},
],
},
];
}

Simple Portable Fields
Next, let's look at the fields that can simply be ported over directly from the source data, like Policy. To do this, we just need to invoke the path keyword from base-converter and feed it the direct JSON object path as a value, like so:
name: {
  path: "data.[0].Policy",
}

Note that in this example, we must call the path with data.[0]: even though Policy is the same for all data rows, we still must specify an element to extract the data from.
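To make the indexing concrete, here is a sketch (plain property access, hypothetical values) of why the path needs a row index: after compileFindings, the source is a single data array, so even a profile-wide field like Policy has to be read off one concrete row. base-converter resolves the equivalent lodash-style path string "data.[0].Policy".

```typescript
// Hypothetical compiled findings: every row repeats the same Policy.
const compiled = {
  data: [
    { Policy: "DISA-STIG SQL Server 2016", Check: "Schema ownership" },
    { Policy: "DISA-STIG SQL Server 2016", Check: "Password complexity" },
  ],
};

// "data.[0].Policy" resolves to the Policy of the first row:
const name = compiled.data[0].Policy;
console.log(name); // "DISA-STIG SQL Server 2016"
```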
Let's put this into practice and start implementing the mappings for simple fields that don't require transformation or processing:
DbProtect-to-OHDF Mapper
import { ExecJSON } from "inspecjs";
import * as _ from "lodash";
import { version as HeimdallToolsVersion } from "../package.json";
import {
BaseConverter,
ILookupPath,
impactMapping,
MappedTransform,
parseXml,
} from "./base-converter";
import {
  DEFAULT_STATIC_CODE_ANALYSIS_NIST_TAGS,
  getCCIsForNISTTags,
} from "./utils/global";
function compileFindings(
input: Record<string, unknown>
): Record<string, unknown> {
const keys = _.get(input, "dataset.metadata.item");
const findings = _.get(input, "dataset.data.row");
let output: unknown[] = [];
if (Array.isArray(keys) && Array.isArray(findings)) {
const keyNames = keys.map((element: Record<string, unknown>): string => {
return _.get(element, "name") as string;
});
output = findings.map((element: Record<string, unknown>) => {
return Object.fromEntries(
keyNames.map(function (name: string, i: number) {
return [name, _.get(element, `value[${i}]`)];
})
);
});
}
return Object.fromEntries([["data", output]]);
}
export class DBProtectMapper extends BaseConverter {
withRaw: boolean;
mappings: MappedTransform<
ExecJSON.Execution & { passthrough: unknown },
ILookupPath
> = {
platform: {
name: "Heimdall Tools",
release: HeimdallToolsVersion,
},
version: HeimdallToolsVersion,
statistics: {},
profiles: [
{
name: { path: "data.[0].Policy" },
title: { path: "data.[0].Job Name" },
summary: {},
supports: [],
attributes: [],
groups: [],
status: "loaded",
controls: [
{
path: "data",
key: "id",
tags: {
nist: DEFAULT_STATIC_CODE_ANALYSIS_NIST_TAGS,
cci: getCCIsForNISTTags(DEFAULT_STATIC_CODE_ANALYSIS_NIST_TAGS),
},
refs: [],
source_location: {},
title: { path: "Check" },
id: {},
desc: {},
impact: {},
code: {},
results: [
{
status: {},
code_desc: { path: "Details" },
start_time: { path: "Date" },
},
],
},
],
sha256: "",
},
],
passthrough: {
transformer: (data: Record<string, unknown>): Record<string, unknown> => {
return { ...(this.withRaw && { raw: data }) };
},
},
};
constructor(dbProtectXml: string, withRaw = false) {
super(compileFindings(parseXml(dbProtectXml)));
this.withRaw = withRaw;
}
}

Remaining DbProtect-to-OHDF Mapping
{
profiles: [
{
maintainer,
summary: [
"Organization",
"Asset",
"Asset Type",
"IP Address, Port, Instance",
],
license,
copyright,
copyright_email,
supports,
attributes,
groups,
controls: [
{
id: "Check ID", // ID of the requirement
desc: ["Task", "Check Category"],
descriptions,
impact: "Risk DV",
refs,
code,
source_location,
results: [
{
status: "Result Status", // The result of the scan for that particular control
message,
run_time,
},
],
},
],
},
];
}

Transformed/Processed Fields
As in the previous example, there are several fields that need to be processed further. To do so, we can make use of the transformer field supported by the OHDF Converters library. Some simple transformers include casting the id field to a string, and mapping the impact fields properly (by creating a mapping and passing it into the impactMapping function in base-converter):
const IMPACT_MAPPING: Map<string, number> = new Map([
["high", 0.7],
["medium", 0.5],
["low", 0.3],
["informational", 0],
]);
function idToString(id: unknown): string {
if (typeof id === "string" || typeof id === "number") {
return id.toString();
} else {
return "";
}
}

For the OHDF fields that concatenate information from multiple DbProtect fields, we must write corresponding transformers that format these strings:
function formatSummary(entry: unknown): string {
  const text = [];
  text.push(`Organization : ${_.get(entry, "Organization")}`);
  text.push(`Asset : ${_.get(entry, "Check Asset")}`);
  text.push(`Asset Type : ${_.get(entry, "Asset Type")}`);
  text.push(
    `IP Address, Port, Instance : ${_.get(
      entry,
      "IP Address, Port, Instance"
    )}`
  );
  return text.join("\n");
}
function formatDesc(entry: unknown): string {
const text = [];
text.push(`Task : ${_.get(entry, "Task")}`);
text.push(`Check Category : ${_.get(entry, "Check Category")}`);
return text.join("; ");
}

Finally, we write one more function to map the Result Status to the proper ExecJSON statuses:
function getStatus(input: unknown): ExecJSON.ControlResultStatus {
switch (input) {
case "Fact":
return ExecJSON.ControlResultStatus.Skipped;
case "Failed":
return ExecJSON.ControlResultStatus.Failed;
case "Finding":
return ExecJSON.ControlResultStatus.Failed;
case "Not A Finding":
return ExecJSON.ControlResultStatus.Passed;
}
return ExecJSON.ControlResultStatus.Skipped;
}

Writing out all these transformers and applying them to the mapping fed into base-converter looks something like this:
Full Mapper Code
import { ExecJSON } from "inspecjs";
import * as _ from "lodash";
import { version as HeimdallToolsVersion } from "../package.json";
import {
BaseConverter,
ILookupPath,
impactMapping,
MappedTransform,
parseXml,
} from "./base-converter";
import {
DEFAULT_STATIC_CODE_ANALYSIS_NIST_TAGS,
getCCIsForNISTTags,
} from "./utils/global";
const IMPACT_MAPPING: Map<string, number> = new Map([
["high", 0.7],
["medium", 0.5],
["low", 0.3],
["informational", 0],
]);
function compileFindings(
input: Record<string, unknown>
): Record<string, unknown> {
const keys = _.get(input, "dataset.metadata.item");
const findings = _.get(input, "dataset.data.row");
let output: unknown[] = [];
if (Array.isArray(keys) && Array.isArray(findings)) {
const keyNames = keys.map((element: Record<string, unknown>): string => {
return _.get(element, "name") as string;
});
output = findings.map((element: Record<string, unknown>) => {
return Object.fromEntries(
keyNames.map(function (name: string, i: number) {
return [name, _.get(element, `value[${i}]`)];
})
);
});
}
return Object.fromEntries([["data", output]]);
}
function formatSummary(entry: unknown): string {
  const text = [];
  text.push(`Organization : ${_.get(entry, "Organization")}`);
  text.push(`Asset : ${_.get(entry, "Check Asset")}`);
  text.push(`Asset Type : ${_.get(entry, "Asset Type")}`);
  text.push(
    `IP Address, Port, Instance : ${_.get(
      entry,
      "IP Address, Port, Instance"
    )}`
  );
  return text.join("\n");
}
function formatDesc(entry: unknown): string {
const text = [];
text.push(`Task : ${_.get(entry, "Task")}`);
text.push(`Check Category : ${_.get(entry, "Check Category")}`);
return text.join("; ");
}
function getStatus(input: unknown): ExecJSON.ControlResultStatus {
switch (input) {
case "Fact":
return ExecJSON.ControlResultStatus.Skipped;
case "Failed":
return ExecJSON.ControlResultStatus.Failed;
case "Finding":
return ExecJSON.ControlResultStatus.Failed;
case "Not A Finding":
return ExecJSON.ControlResultStatus.Passed;
}
return ExecJSON.ControlResultStatus.Skipped;
}
function idToString(id: unknown): string {
if (typeof id === "string" || typeof id === "number") {
return id.toString();
} else {
return "";
}
}
export class DBProtectMapper extends BaseConverter {
withRaw: boolean;
mappings: MappedTransform<
ExecJSON.Execution & { passthrough: unknown },
ILookupPath
> = {
platform: {
name: "Heimdall Tools",
release: HeimdallToolsVersion,
},
version: HeimdallToolsVersion,
statistics: {},
profiles: [
{
name: { path: "data.[0].Policy" },
title: { path: "data.[0].Job Name" },
summary: { path: "data.[0]", transformer: formatSummary },
supports: [],
attributes: [],
groups: [],
status: "loaded",
controls: [
{
path: "data",
key: "id",
tags: {
nist: DEFAULT_STATIC_CODE_ANALYSIS_NIST_TAGS,
cci: getCCIsForNISTTags(DEFAULT_STATIC_CODE_ANALYSIS_NIST_TAGS),
},
refs: [],
source_location: {},
title: { path: "Check" },
id: { path: "Check ID", transformer: idToString },
desc: { transformer: formatDesc },
impact: {
path: "Risk DV",
transformer: impactMapping(IMPACT_MAPPING),
},
code: {
transformer: (vulnerability: Record<string, unknown>): string =>
JSON.stringify(vulnerability, null, 2),
},
results: [
{
status: { path: "Result Status", transformer: getStatus },
code_desc: { path: "Details" },
start_time: { path: "Date" },
},
],
},
],
sha256: "",
},
],
passthrough: {
transformer: (data: Record<string, unknown>): Record<string, unknown> => {
return { ...(this.withRaw && { raw: data }) };
},
},
};
constructor(dbProtectXml: string, withRaw = false) {
super(compileFindings(parseXml(dbProtectXml)));
this.withRaw = withRaw;
}
}

Now we have a fully implemented DbProtect-to-OHDF mapper.
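As a quick sanity check, here is how the transformer helpers behave on a sample row. This is a standalone sketch: the helpers are re-declared locally with hypothetical sample values, and the ExecJSON.ControlResultStatus enum members are assumed to resolve to the lowercase strings shown below.

```typescript
// Assumed stand-in for inspecjs's ExecJSON.ControlResultStatus values.
const Status = { Passed: "passed", Failed: "failed", Skipped: "skipped" } as const;

// Same status mapping as the mapper: "Fact" rows record information rather
// than a pass/fail result, so they fall through to Skipped.
function getStatus(input: unknown): string {
  switch (input) {
    case "Failed":
    case "Finding":
      return Status.Failed;
    case "Not A Finding":
      return Status.Passed;
    default:
      return Status.Skipped;
  }
}

// Same concatenation as formatDesc above, without the lodash dependency.
function formatDesc(entry: Record<string, unknown>): string {
  return [
    `Task : ${entry["Task"]}`,
    `Check Category : ${entry["Check Category"]}`,
  ].join("; ");
}

// Hypothetical compiled row.
const row = {
  Task: "Audit",
  "Check Category": "Improper Access Controls",
  "Result Status": "Fact",
};

console.log(getStatus(row["Result Status"])); // "skipped"
console.log(formatDesc(row)); // "Task : Audit; Check Category : Improper Access Controls"
```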