# Appendix - SAF CLI Integration
The following is a supplemental lesson on how to integrate your developed mapper with the SAF CLI. Integration with the SAF CLI will allow your mapper to be used in a command line environment, entirely independent of Heimdall2.

If you have not yet created a mapper, please follow the primary course and do so first. Once completed, you will be able to continue with this lesson.
## Set Up
First, we need to set up the necessary files to begin integrating your mapper with the SAF CLI.
- Create a development branch against the SAF CLI repository and create a draft pull request for your new branch.

- In the `package.json` file, update the versions of `@mitre/hdf-converters` and `@mitre/heimdall-lite` to the latest release of Heimdall2.

- In the `src/commands/convert` directory, create a blank TypeScript file. It should be named:

  `{YOUR-EXPORT-NAME-HERE}2hdf.ts`

- In the `test/sample_data` directory, create a directory named `{YOUR-EXPORT-NAME-HERE}`. Underneath it, create a directory named `sample_input_report`. The file structure should now look like this:

  ```
  +-- sample_data
  |   +-- {YOUR-EXPORT-NAME-HERE}
  |   |   +-- sample_input_report
  ```

- Place your sample export under the `sample_input_report` directory. Your sample export should be genericized to avoid leaking any sensitive information. Under the `{YOUR-EXPORT-NAME-HERE}` directory, place the output OHDF files generated during the original testing phase of your mapper development. The file structure should now look like this:

  ```
  +-- sample_data
  |   +-- {YOUR-EXPORT-NAME-HERE}
  |   |   +-- sample_input_report
  |   |   |   +-- {YOUR-SAMPLE-EXPORT}
  |   |   +-- {YOUR-EXPORT-NAME-HERE}-hdf.json
  |   |   +-- {YOUR-EXPORT-NAME-HERE}-hdf-withraw.json
  ```

- In the `test/commands/convert` directory, create a blank TypeScript file. It should be named:

  `{YOUR-EXPORT-NAME-HERE}2hdf.test.ts`

## Integration
> **APIs**
>
> If your security tool provides an API instead of a security data export, you will need to explore the API to determine what information may need to be collected and collated for use in the convert command. For further guidance, refer to previous API-based command implementations or contact the SAF team for help.
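To make the shape of such a command more concrete, here is a minimal sketch of what an API-based convert command might look like. It is not a working SAF CLI command: the service endpoint, flag names, and response handling are hypothetical placeholders, and a real implementation should follow the patterns used by the existing API-based commands under `src/commands/convert`.

```typescript
// Hypothetical sketch only: the service endpoint, flag names, and response
// shape are placeholders, not part of the SAF CLI. Existing API-based
// commands in src/commands/convert show the patterns actually used.
import {Command, Flags} from '@oclif/core'
import fs from 'fs'

export default class ExampleApi2HDF extends Command {
  static description = 'Pull findings from a (hypothetical) REST API and translate them into an HDF results set'

  static flags = {
    help: Flags.help({char: 'h'}),
    url: Flags.string({char: 'u', required: true, description: 'Base URL of the example service'}),
    apikey: Flags.string({char: 'a', required: true, description: 'API token used to authenticate'}),
    output: Flags.string({char: 'o', required: true, description: 'Output HDF file'}),
  }

  async run() {
    const {flags} = await this.parse(ExampleApi2HDF)

    // An API frequently requires several calls (e.g. one per project or scan)
    // that must be collected and collated before the data is mapper-ready.
    const response = await fetch(`${flags.url}/api/v1/findings`, {
      headers: {Authorization: `Bearer ${flags.apikey}`},
    })
    const findings = await response.json()

    // Hand the collated data to your mapper just as you would a file export,
    // e.g. `new Mapper(JSON.stringify(findings)).toHdf()`, then write it out.
    fs.writeFileSync(flags.output, JSON.stringify(findings, null, 2))
  }
}
```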
With file setup out of the way, we can begin filling out the necessary files to create a `saf convert` command for our mapper.
> **Info**
>
> For further guidance on writing good usage strings for command line interfaces, refer here.
- Insert the skeleton convert command file (see below) into the `{YOUR-EXPORT-NAME-HERE}2hdf.ts` file which you created earlier. Replace names (`SKELETON` by default) as necessary.
**Skeleton Convert Command File**
```typescript
import {Command, Flags} from '@oclif/core'
import fs from 'fs'
import {SKELETONMapper as Mapper} from '@mitre/hdf-converters'
import {checkSuffix} from '../../utils/global'

export default class SKELETON2HDF extends Command {
  static usage = 'convert SKELETON2hdf -i <SKELETON-json> -o <hdf-scan-results-json>'

  static description = 'Translate a SKELETON output file into an HDF results set'

  static examples = ['saf convert SKELETON2hdf -i SKELETON.json -o output-hdf-name.json']

  static flags = {
    help: Flags.help({char: 'h'}),
    input: Flags.string({char: 'i', required: true, description: 'Input SKELETON file'}),
    output: Flags.string({char: 'o', required: true, description: 'Output HDF file'}),
    'with-raw': Flags.boolean({char: 'w', required: false, description: 'Include raw input file in HDF JSON file'}),
  }

  async run() {
    const {flags} = await this.parse(SKELETON2HDF)

    // Read the input export and hand it to the mapper; the with-raw flag
    // controls whether the raw input is embedded in the generated HDF file.
    const input = fs.readFileSync(flags.input, 'utf8')
    const converter = new Mapper(input, flags['with-raw'])
    fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf()))
  }
}
```

- Insert the appropriate skeleton convert command test file (see below) into the `{YOUR-EXPORT-NAME-HERE}2hdf.test.ts` file which you created earlier. Replace names (`SKELETON` by default) as necessary.
**JSON Skeleton Convert Command Test File**
```typescript
import { expect, test } from "@oclif/test";
import tmp from "tmp";
import path from "path";
import fs from "fs";
import { omitHDFChangingFields } from "../utils";
describe("Test SKELETON", () => {
const tmpobj = tmp.dirSync({ unsafeCleanup: true });
test
.stdout()
.command([
"convert SKELETON2hdf",
"-i",
path.resolve(
"./test/sample_data/SKELETON/sample_input_report/SKELETON_sample.json"
),
"-o",
`${tmpobj.name}/SKELETONtest.json`,
])
.it("hdf-converter output test", () => {
const converted = JSON.parse(
fs.readFileSync(`${tmpobj.name}/SKELETONtest.json`, "utf8")
);
const sample = JSON.parse(
fs.readFileSync(
path.resolve("./test/sample_data/SKELETON/SKELETON-hdf.json"),
"utf8"
)
);
expect(omitHDFChangingFields(converted)).to.eql(
omitHDFChangingFields(sample)
);
});
});
describe("Test SKELETON withraw flag", () => {
const tmpobj = tmp.dirSync({ unsafeCleanup: true });
test
.stdout()
.command([
"convert SKELETON2hdf",
"-i",
path.resolve(
"./test/sample_data/SKELETON/sample_input_report/SKELETON_sample.json"
),
"-o",
`${tmpobj.name}/SKELETONtest.json`,
"-w",
])
.it("hdf-converter withraw output test", () => {
const converted = JSON.parse(
fs.readFileSync(`${tmpobj.name}/SKELETONtest.json`, "utf8")
);
const sample = JSON.parse(
fs.readFileSync(
path.resolve("./test/sample_data/SKELETON/SKELETON-hdf-withraw.json"),
"utf8"
)
);
expect(omitHDFChangingFields(converted)).to.eql(
omitHDFChangingFields(sample)
);
});
});
```

**XML Skeleton Convert Command Test File**
```typescript
import { expect, test } from "@oclif/test";
import tmp from "tmp";
import path from "path";
import fs from "fs";
import { omitHDFChangingFields } from "../utils";
describe("Test SKELETON", () => {
const tmpobj = tmp.dirSync({ unsafeCleanup: true });
test
.stdout()
.command([
"convert SKELETON2hdf",
"-i",
path.resolve(
"./test/sample_data/SKELETON/sample_input_report/SKELETON_sample.xml"
),
"-o",
`${tmpobj.name}/SKELETONtest.json`,
])
.it("hdf-converter output test", () => {
const converted = JSON.parse(
fs.readFileSync(`${tmpobj.name}/SKELETONtest.json`, "utf8")
);
const sample = JSON.parse(
fs.readFileSync(
path.resolve("./test/sample_data/SKELETON/SKELETON-hdf.json"),
"utf8"
)
);
expect(omitHDFChangingFields(converted)).to.eql(
omitHDFChangingFields(sample)
);
});
});
describe("Test SKELETON withraw flag", () => {
const tmpobj = tmp.dirSync({ unsafeCleanup: true });
test
.stdout()
.command([
"convert SKELETON2hdf",
"-i",
path.resolve(
"./test/sample_data/SKELETON/sample_input_report/SKELETON_sample.xml"
),
"-o",
`${tmpobj.name}/SKELETONtest.json`,
"-w",
])
.it("hdf-converter withraw output test", () => {
const converted = JSON.parse(
fs.readFileSync(`${tmpobj.name}/SKELETONtest.json`, "utf8")
);
const sample = JSON.parse(
fs.readFileSync(
path.resolve("./test/sample_data/SKELETON/SKELETON-hdf-withraw.json"),
"utf8"
)
);
expect(omitHDFChangingFields(converted)).to.eql(
omitHDFChangingFields(sample)
);
});
});
```

- Navigate to the `index.ts` file under the `src/commands/convert` directory. Import your mapper using the existing import block as follows:
```typescript
import {
  ASFFResults,
  ...
  {YOUR-MAPPER-CLASS-HERE}
} from '@mitre/hdf-converters'
```

- Under the switch block in the `getFlagsForInputFile` function, add your mapper class as it is defined for fingerprinting for the generic convert command. If the convert command for your mapper has any additional flags beyond the standard `--input` and `--output` flags, return the entire flag block in the switch case. This is demonstrated as follows (a filled-in example using hypothetical names follows the skeleton):
```typescript
switch (Convert.detectedType) {
  ...
  case {YOUR-EXPORT-SERVICE-NAME-HERE}:
    return {YOUR-CLI-CONVERT-CLASS}.flags // Only add if special flags exist
  ...
  return {}
}
```
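For instance, assuming a hypothetical export whose fingerprint name is `skeleton` and whose convert command class `SKELETON2HDF` defines the extra `-w`/`--with-raw` flag, the added case might look like this (both names are placeholders carried over from the skeleton above, and the surrounding cases are elided):

```typescript
// Hypothetical illustration: 'skeleton' and SKELETON2HDF stand in for your
// export's fingerprint name and the CLI convert class created earlier.
switch (Convert.detectedType) {
  // ...other detected types...
  case 'skeleton':
    // Return the full flag block only because SKELETON2HDF defines an extra
    // -w/--with-raw flag beyond the standard --input/--output flags.
    return SKELETON2HDF.flags
}
return {}
```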
- Edit the `README.md` file to reflect your newly added conversion command under the `Convert To HDF` section. It should be formatted as follows:
##### {YOUR-EXPORT-NAME-HERE} to HDF

```
convert {YOUR-EXPORT-NAME-HERE}2hdf   Translate a {YOUR-EXPORT-NAME-HERE} results {EXPORT-TYPE} into an HDF results set

USAGE
  $ saf convert {YOUR-EXPORT-NAME-HERE}2hdf -i <{INPUT-NAME}> -o <hdf-scan-results-json>

OPTIONS
  -i, --input=input    Input {EXPORT-TYPE} File
  -o, --output=output  Output HDF JSON File
  -w, --with-raw       Include raw input file in HDF JSON file

EXAMPLES
  $ saf convert {YOUR-EXPORT-NAME-HERE}2hdf -i {INPUT-NAME} -o output-hdf-name.json
```

- Commit your changes and mark your pull request as 'ready for review'. Request a code review from members of the SAF team and edit your code as necessary. Once your changes are approved, merged, and released, your mapper will be callable using the SAF CLI.