Import scan reports

DefectDojo has the ability to import scan reports from a large number of security tools.

Security Tools

Acunetix Scanner

XML format

Acunetix 360 Scanner

Vulnerabilities List - JSON report

Anchore-Engine

JSON vulnerability report generated by anchore-cli tool, using a command like anchore-cli --json image vuln <image:tag> all

Aqua

JSON report format.

Anchore Grype

Anchore Grype JSON report format generated with -o json option.

grype defectdojo/defectdojo-django:1.13.1 -o json > many_vulns.json

Arachni Scanner

Arachni Web Scanner (http://arachni-scanner.com/wiki)

Reports are generated with the arachni_reporter tool as follows:

arachni_reporter --reporter 'json' js.com.afr

AppSpider (Rapid7)

Use the VulnerabilitiesSummary.xml file found in the zipped report download.

AuditJS (OSSIndex)

The AuditJS scanning tool uses the OSSIndex database; reports are generated with the --json or -j option (https://www.npmjs.com/package/auditjs).

auditjs ossi --json > auditjs_report.json

AWS Security Hub

The JSON output from AWS Security Hub, exported with the aws securityhub get-findings command (https://docs.aws.amazon.com/cli/latest/reference/securityhub/get-findings.html).

AWS Scout2 Scanner (deprecated)

JS file in scout2-report/inc-awsconfig/aws_config.js.

AWS Prowler Scanner

A Prowler file can be imported as a CSV (-M csv) or JSON (-M json) file.

Bandit

JSON report format

Blackduck Hub

Two options:

  • Import the zip file created by the Blackduck export. The zip file must contain security.csv and files.csv in order to produce findings that include file location information.
  • Import a single security.csv file. Findings will not have any file location information.

Brakeman Scan

Import Brakeman Scanner findings in JSON format.

Bugcrowd

Import Bugcrowd results in CSV format.

Bundler-Audit

Import the text output generated with bundle-audit check

Burp XML

When the Burp report is generated, the recommended option is Base64-encoding both the request and response fields: check the box that says "Base64-encode requests and responses". These fields will be processed and made available on the 'Finding View' page.
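For reference, the Base64-encoded fields can be decoded with a few lines of Python. This is a minimal sketch, assuming the standard Burp XML layout in which each issue carries a requestresponse element whose request and response children have a base64="true" attribute; the sample document here is illustrative, not a real report.

```python
import base64
import xml.etree.ElementTree as ET

# Minimal sample resembling a Burp XML issue with Base64-encoded traffic.
SAMPLE = """
<issues>
  <issue>
    <requestresponse>
      <request base64="true">R0VUIC8gSFRUUC8xLjE=</request>
      <response base64="true">SFRUUC8xLjEgMjAwIE9L</response>
    </requestresponse>
  </issue>
</issues>
"""

def decode_traffic(xml_text):
    """Yield (request, response) text decoded from the Base64 fields."""
    root = ET.fromstring(xml_text)
    for issue in root.iter("issue"):
        req = issue.findtext("requestresponse/request", default="")
        resp = issue.findtext("requestresponse/response", default="")
        yield (base64.b64decode(req).decode(errors="replace"),
               base64.b64decode(resp).decode(errors="replace"))

for request, response in decode_traffic(SAMPLE):
    print(request)   # GET / HTTP/1.1
    print(response)  # HTTP/1.1 200 OK
```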

Burp Enterprise Scan

Import HTML reports from Burp Enterprise Edition

Burp GraphQL

Import the JSON data returned from the BurpSuite Enterprise GraphQL API. Append all the issues returned to a list and save it as the value for the key “Issues”. There is no need to filter duplicates; the parser will automatically combine issues with the same name.

Example:

{
    "Issues": [
        {
            "issue_type": {
                "name": "Cross-site scripting (reflected)",
                "description_html": "Issue Description",
                "remediation_html": "Issue Remediation",
                "vulnerability_classifications_html": "<li><a href=\"https://cwe.mitre.org/data/definitions/79.html\">CWE-79: Improper Neutralization of Input During Web Page Generation ('Cross-site Scripting')</a></li>",
                "references_html": "<li><a href=\"https://portswigger.net/web-security/cross-site-scripting\">Cross-site scripting</a></li>"
            },
            "description_html": "Details",
            "remediation_html": "Remediation Details",
            "severity": "high",
            "path": "/burp",
            "origin": "https://portswigger.net",
            "evidence": [
                {
                    "request_index": 0,
                    "request_segments": [
                        {
                            "data_html": "GET"
                        },
                        {
                            "highlight_html": "data"
                        },
                        {
                            "data_html": " HTTP More data"
                        }
                    ]
                },
                {
                    "response_index": 0,
                    "response_segments": [
                        {
                            "data_html": "HTTP/2 200 OK "
                        },
                        {
                            "highlight_html": "data"
                        },
                        {
                            "data_html": "More data"
                        }
                    ]
                }
            ]
        }
    ]
}

Example GraphQL query to get issue details:

    query Issue ($id: ID!, $serial_num: ID!) {
        issue(scan_id: $id, serial_number: $serial_num) {
            issue_type {
                name
                description_html
                remediation_html
                vulnerability_classifications_html
                references_html
            }
            description_html
            remediation_html
            severity
            path
            origin
            evidence {
                ... on Request {
                    request_index
                    request_segments {
                        ... on DataSegment {
                            data_html
                        }
                        ... on HighlightSegment {
                                highlight_html
                        }
                    }
                }
                ... on Response {
                    response_index
                    response_segments {
                        ... on DataSegment {
                            data_html
                        }
                        ... on HighlightSegment {
                            highlight_html
                        }
                    }
                }
            }
        }
    }
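The aggregation step described above (append each issue returned by the API to a list, then save the list under the key "Issues") can be sketched as follows. get_issue here is a hypothetical helper standing in for the GraphQL call shown above; the serial numbers and scan id are illustrative.

```python
import json

def get_issue(scan_id, serial_number):
    """Hypothetical stand-in for the GraphQL issue query shown above."""
    return {"severity": "high", "path": "/burp", "origin": "https://portswigger.net"}

issues = []
for serial in (1, 2, 3):              # iterate over the scan's issue serial numbers
    issues.append(get_issue("scan-1", serial))

report = {"Issues": issues}           # the parser expects this top-level key
with open("burp_graphql_report.json", "w") as fh:
    json.dump(report, fh, indent=2)
```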

CargoAudit Scan

Import JSON output of cargo-audit scan report https://crates.io/crates/cargo-audit

CCVS Report

Import JSON reports from [CCVS API](https://github.com/William-Hill-Online/CCVS-API)

Checkov Report

Import JSON reports of Infrastructure as Code vulnerabilities.

Clair Scan

Import JSON reports of Docker image vulnerabilities.

Clair Klar Scan

Import JSON reports of Docker image vulnerabilities from clair klar client.

Cobalt.io Scan

CSV Report

Cobalt.io API Import

Import findings from the Cobalt.io API - no file required.

Follow these steps to setup API importing:

  1. Configure the Cobalt.io authentication details by navigating to Configuration->Tool Configuration, setting the Tool Type to “Cobalt.io” and the Authentication Type to “API Key”. Paste your Cobalt.io API key in the “API Key” input and the desired org token in the “Extras” input.
  2. In the Product settings select “Add Cobalt.io Configuration”. Provide the ID of the asset from which to import findings. The ID can be found at the end of the URL when viewing the asset in your browser. Also select the appropriate “Cobalt.io” configuration.
  3. After this is done, you can import the findings as a scan by selecting “Cobalt.io API Import” as the scan type. If you have more than one asset configured, you must also select which “Cobalt.io Config” to use.

CodeQL

CodeQL can be used to generate a SARIF report that can be imported into DefectDojo:

codeql database analyze db python-security-and-quality.qls --sarif-add-snippets --format=sarif-latest --output=security-extended.sarif

The same can be achieved by running the CodeQL GitHub action with the add-snippet property set to true.

Coverity API

Export Coverity API view data in JSON format (/api/viewContents/issues endpoint).

Currently these columns are mandatory:

  • displayType (Type in the UI)
  • displayImpact (Impact in the UI)
  • status (Status in the UI)
  • firstDetected (First Detected in the UI)

Other supported attributes: cwe, displayFile, occurrenceCount and firstDetected

Crashtest Security

  • Import JSON Report
  • Import XML Report in JUnit Format

CredScan Report

Import CSV credential scanner reports

Contrast Scanner

CSV Report

Checkmarx

  • Checkmarx Scan, Checkmarx Scan detailed: XML report from Checkmarx SAST (source code analysis)
  • Checkmarx OSA: json report from Checkmarx Open Source Analysis (dependencies analysis)

To generate the OSA report using Checkmarx CLI: ./runCxConsole.sh OsaScan -v -CxServer <...> -CxToken <..> -projectName <...> -enableOsa -OsaLocationPath <lib_folder> -OsaJson <output_folder>

That will generate three files, two of which are needed for DefectDojo. Build the file for DefectDojo with the jq utility: jq -s . CxOSAVulnerabilities.json CxOSALibraries.json
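The jq -s (slurp) invocation simply wraps the two documents in one top-level JSON array. A minimal Python equivalent, in case jq is not available (the file contents written here are stand-ins so the sketch is runnable; in practice the two files come from the Checkmarx CLI run above):

```python
import json

# Stand-in content so the sketch is runnable end to end.
with open("CxOSAVulnerabilities.json", "w") as fh:
    json.dump([{"id": 1}], fh)
with open("CxOSALibraries.json", "w") as fh:
    json.dump([{"name": "lib"}], fh)

def slurp(*paths):
    """Wrap each input document in one top-level JSON array, like `jq -s .`."""
    docs = []
    for path in paths:
        with open(path) as fh:
            docs.append(json.load(fh))
    return docs

combined = slurp("CxOSAVulnerabilities.json", "CxOSALibraries.json")
with open("checkmarx_osa.json", "w") as fh:
    json.dump(combined, fh)
```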

Choctaw Hog parser

From: https://github.com/newrelic/rusty-hog Import the JSON output.

Cloudsploit (AquaSecurity)

From: https://github.com/aquasecurity/cloudsploit . Import the JSON output.

CycloneDX

CycloneDX is a lightweight software bill of materials (SBOM) standard designed for use in application security contexts and supply chain component analysis.

From: https://www.cyclonedx.org/

Example with Anchore Grype:

./grype defectdojo/defectdojo-django:1.13.1 -o cyclonedx > report.xml

Example with cyclonedx-bom tool:

pip install cyclonedx-bom
cyclonedx-py
  Usage:  cyclonedx-py [OPTIONS]
  Options:
    -i <path> - the alternate filename to a frozen requirements.txt
    -o <path> - the bom file to create
    -j        - generate JSON instead of XML

DawnScanner

Import report in JSON generated with -j option

Dependency Check

OWASP Dependency Check output can be imported in XML format. This parser ingests the vulnerable dependencies and inherits the suppressions.

  • Suppressed vulnerabilities are tagged with the tag: suppressed.
  • Suppressed vulnerabilities are marked as inactive, but not as mitigated.
  • If a suppression is missing the <notes> tag, the vulnerability is tagged no_suppression_document.
  • Related vulnerable dependencies are tagged with related tag.

Dependency Track

Dependency Track has implemented a DefectDojo integration. Information about how to configure the integration is documented here: https://docs.dependencytrack.org/integrations/defectdojo/

Alternatively, the Finding Packaging Format (FPF) from OWASP Dependency Track can be imported in JSON format. See here for more info on this JSON format: https://docs.dependencytrack.org/integrations/file-formats/

DrHeader

Import of JSON report from https://github.com/Santandersecurityresearch/DrHeader

Dockle Report

Import JSON container image linter reports https://github.com/goodwithtech/dockle

Detect-secrets

Import of JSON report from https://github.com/Yelp/detect-secrets

ESLint

ESLint Json report format (-f json)

Fortify

Import Findings from XML file format.

Generic Findings Import

Import Generic findings in CSV or JSON format.

Attributes supported for CSV:

  • Title
  • Description
  • Date
  • Severity
  • Duplicate (‘TRUE’, ‘FALSE’)
  • Active (‘TRUE’, ‘FALSE’)
  • Mitigation
  • Impact
  • References
  • Verified (‘TRUE’, ‘FALSE’)
  • FalsePositive
  • CVE
  • CweId
  • CVSSV3
  • Url

Example of JSON format:

{
    "findings": [
        {
            "title": "test title with endpoints as dict",
            "description": "Some very long description with\n\n some UTF-8 chars à qu'il est beau",
            "severity": "Medium",
            "mitigation": "Some mitigation",
            "date": "2021-01-06",
            "cve": "CVE-2020-36234",
            "cwe": 261,
            "cvssv3": "CVSS:3.1/AV:N/AC:L/PR:H/UI:R/S:C/C:L/I:L/A:N",
            "file_path": "src/first.cpp",
            "line": 13,
            "endpoints": [
                {
                    "host": "exemple.com"
                }
            ]
        },
        {
            "title": "test title with endpoints as strings",
            "description": "Some very long description with\n\n some UTF-8 chars à qu'il est beau2",
            "severity": "Critical",
            "mitigation": "Some mitigation",
            "date": "2021-01-06",
            "cve": "CVE-2020-36235",
            "cwe": 287,
            "cvssv3": "CVSS:3.1/AV:N/AC:L/PR:H/UI:R/S:C/C:L/I:L/A:N",
            "file_path": "src/two.cpp",
            "line": 135,
            "endpoints": [
                "http://urlfiltering.paloaltonetworks.com/test-command-and-control",
                "https://urlfiltering.paloaltonetworks.com:2345/test-pest"
            ]
        },
        {
            "title": "test title",
            "description": "Some very long description with\n\n some UTF-8 chars à qu'il est beau2",
            "severity": "Critical",
            "mitigation": "Some mitigation",
            "date": "2021-01-06",
            "cve": "CVE-2020-36236",
            "cwe": 287,
            "cvssv3": "CVSS:3.1/AV:N/AC:L/PR:H/UI:R/S:C/C:L/I:L/A:N",
            "file_path": "src/threeeeeeeeee.cpp",
            "line": 1353
        }
    ]
}
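A small sketch that checks a generic JSON report has the expected shape before upload. The per-finding keys checked here mirror the example above, and the severity values mirror DefectDojo's levels; this is an illustration, not the importer's actual validation logic.

```python
import json

VALID_SEVERITIES = {"Info", "Low", "Medium", "High", "Critical"}

def validate_generic_report(text):
    """Return a list of problems found in a generic-findings JSON document."""
    problems = []
    findings = json.loads(text).get("findings")
    if not isinstance(findings, list):
        return ["top-level 'findings' list is missing"]
    for i, finding in enumerate(findings):
        for key in ("title", "description", "severity"):
            if key not in finding:
                problems.append(f"finding {i}: missing '{key}'")
        if finding.get("severity") not in VALID_SEVERITIES:
            problems.append(f"finding {i}: unrecognised severity")
    return problems

sample = '{"findings": [{"title": "t", "description": "d", "severity": "Medium"}]}'
print(validate_generic_report(sample))  # []
```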

Gosec Scanner

Import Gosec Scanner findings in JSON format.

Gitleaks

Import Gitleaks findings in JSON format.

GitLab SAST Report

Import SAST Report vulnerabilities in JSON format: https://docs.gitlab.com/ee/user/application_security/sast/#reports-json-format

GitLab Dependency Scanning Report

Import Dependency Scanning Report vulnerabilities in JSON format: https://docs.gitlab.com/ee/user/application_security/dependency_scanning/#reports-json-format

Github Vulnerability

Import findings from Github vulnerability scan: https://help.github.com/en/github/managing-security-vulnerabilities

Currently the parser only handles the RepositoryVulnerabilityAlert object. The parser includes a search feature that detects the relevant data in the report.

Here are the mandatory objects and attributes:

vulnerabilityAlerts (RepositoryVulnerabilityAlert object)
    + id
    + createdAt (optional)
    + vulnerableManifestPath (optional)
    + securityVulnerability (SecurityVulnerability object)
        + severity (CRITICAL/HIGH/LOW/MODERATE)
        + package (optional)
            + name (optional)
        + advisory (SecurityAdvisory object)
            + description
                + summary
                + description
                + identifiers
                    + value
                + references (optional)
                    + url (optional)
                + cvss (optional)
                    + vectorString (optional)

References:

GitHub v4 GraphQL query to fetch the data:

    query getVulnerabilitiesByOwner($owner: String!) {
    search(query: $owner, type: REPOSITORY, first: 100) {
      nodes {
        ... on Repository {
          name
          vulnerabilityAlerts(last: 100) {
            nodes {
              id
              securityVulnerability {
                severity
                package {
                  name
                }
                advisory {
                  description
                  summary
                  identifiers {
                    type
                    value
                  }
                  references {
                    url
                  }
                }
              }
            }
          }
        }
      }
    }
  }

Another example: a Python script that queries a single repository:


import json
import requests


query = """
query getVulnerabilitiesByRepoAndOwner($name: String!, $owner: String!) {
  repository(name: $name, owner: $owner) {
    vulnerabilityAlerts(first: 100) {
      nodes {
        id
        createdAt
        securityVulnerability {
          severity
          package {
            name
            ecosystem
          }
          advisory {
            description
            summary
            identifiers {
              value
              type
            }
            references {
              url
            }
            cvss {
              vectorString
            }
          }
        }
        vulnerableManifestPath
      }
    }
  }
}
"""

token = '...' # generated from GitHub settings
headers = {"Authorization": "Bearer " + token}


request = requests.post(url='https://api.github.com/graphql',
                        json={
                          "operationName": "getVulnerabilitiesByRepoAndOwner",
                          'query': query,
                          'variables': {
                            'name': 'gogoph',
                            'owner': 'damiencarol'
                          }
                        },
                        headers=headers)

result = request.json()
print(json.dumps(result, indent=2))

Hadolint

Hadolint Dockerfile scan in json format.

Harbor Vulnerability

Import findings from Harbor registry container scan: https://github.com/goharbor/harbor

HuskyCI Report

Import JSON reports from HuskyCI

IBM AppScan DAST

XML file from IBM App Scanner.

Immuniweb Scan

XML Scan Result File from Immuniweb Scan.

IntSights Report

IntSights Threat Command is a commercial Threat Intelligence platform that monitors both the open and dark web to identify threats for the Assets you care about (Domain Names, IP addresses, Brand Names, etc.).

Manual Import

Use the Export CSV feature in the IntSights Threat Command GUI to create an IntSights Alerts.csv file. This CSV file can then be imported into Defect Dojo.

Automated Import

The IntSights get-complete-alert API only returns details for a single alert. To automate the process, fetch details for each alert individually and append them to a list. Save the list as the value for the key “Alerts”. This JSON object can then be imported into DefectDojo.

Example:

{
   "Alerts":[
      {
         "_id":"5c80egf83b4a3900078b6be6",
         "Details":{
            "Source":{
               "URL":"https://www.htbridge.com/websec/?id=ABCDEF",
               "Date":"2018-03-08T00:01:02.622Z",
               "Type":"Other",
               "NetworkType":"ClearWeb"
            },
           "Images":[
              "5c80egf833963a40007e01e8d",
              "5c80egf833b4a3900078b6bea",
              "5c80egf834626bd0007bd64db"
           ],
           "Title":"HTTP headers weakness in example.com web server",
           "Tags":[],
           "Type":"ExploitableData",
           "Severity":"Critical",
           "SubType":"VulnerabilityInTechnologyInUse",
           "Description":"X-XSS-PROTECTION and CONTENT-SECURITY-POLICY headers were not sent by the server, which makes it vulnerable for various attack vectors"
        },
        "Assignees":[
           "5c3c8f99903dfd0006ge5e61"
        ],
        "FoundDate":"2018-03-08T00:01:02.622Z",
        "Assets":[
           {
              "Type":"Domains",
              "Value":"example.com"
           }
        ],
        "TakedownStatus":"NotSent",
        "IsFlagged":false,
        "UpdateDate":"2018-03-08T00:01:02.622Z",
        "RelatedIocs":[],
        "RelatedThreatIDs":[],
        "Closed":{
           "IsClosed":false
        }
     }
  ]
}
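The aggregation step above can be sketched as follows. fetch_alert is a hypothetical stand-in for a call to the get-complete-alert endpoint (the endpoint details are not shown here, only the shape of the output file the importer expects):

```python
import json

def fetch_alert(alert_id):
    """Hypothetical stand-in for an IntSights get-complete-alert API call."""
    return {"_id": alert_id, "Details": {"Severity": "Critical"}}

alert_ids = ["5c80egf83b4a3900078b6be6"]          # ids from a list-alerts call
alerts = [fetch_alert(alert_id) for alert_id in alert_ids]

with open("intsights_alerts.json", "w") as fh:
    json.dump({"Alerts": alerts}, fh, indent=2)   # top-level key expected by the parser
```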

JFrogXRay

Import the JSON format for the "Security Export" file. Use this importer for Xray version 2.X

JFrog XRay Unified

Import the JSON format for the "Security & Compliance | Reports" export. JFrog's Xray tool is an add-on to their Artifactory repository that does Software Composition Analysis; see https://www.jfrog.com/confluence/display/JFROG/JFrog+Xray for more information. "Xray Unified" refers to Xray version 3.0 and later.

Kiuwan Scanner

Import Kiuwan Scan in CSV format. Export as CSV Results on Kiuwan.

kube-bench Scanner

Import JSON reports of Kubernetes CIS benchmark scans.

KICS Scanner

Import of JSON report from https://github.com/Checkmarx/kics

Meterian Scanner

The Meterian JSON report output file can be imported.

Microfocus Webinspect Scanner

Import XML report

MobSF Scanner

Export a JSON file using the API, api/v1/report_json.

Mobsfscan

Import JSON report from https://github.com/MobSF/mobsfscan

Mozilla Observatory Scanner

Import JSON report.

Nessus (Tenable)

Reports can be imported in CSV or .nessus (XML) format.

Nessus WAS (Tenable)

Reports can be imported in CSV or .nessus (XML) format.

Netsparker

Vulnerabilities List - JSON report

Nexpose XML 2.0 (Rapid7)

Use the full XML export template from Nexpose.

Nikto

Nikto web server scanner - https://cirt.net/Nikto2

The current parser supports three sources:

  • XML output (old)
  • new XML output (with nxvmlversion="1.2" type)
  • JSON output

See: https://github.com/sullo/nikto

Nmap

XML output (use -oX)

Node Security Platform

Node Security Platform (NSP) output file can be imported in JSON format.

NPM Audit

Node Package Manager (NPM) Audit plugin output file can be imported in JSON format. Only the 'advisories' subtree is imported.
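For orientation, the advisories subtree looks roughly like this in the older npm audit --json format; the surrounding keys and values below are abridged and illustrative, not a complete report:

```python
# Abridged, illustrative shape of an older-format `npm audit --json` report.
report = {
    "advisories": {
        "118": {"module_name": "lodash", "severity": "high",
                "title": "Prototype Pollution"}
    },
    "metadata": {"totalDependencies": 42},
}

advisories = report.get("advisories", {})   # the only subtree the parser reads
for advisory_id, advisory in advisories.items():
    print(advisory_id, advisory["module_name"], advisory["severity"])
```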

Nuclei

Import JSON output of nuclei scan report https://github.com/projectdiscovery/nuclei

Openscap Vulnerability Scan

Import OpenSCAP Vulnerability Scan in XML format.

OpenVAS CSV

Import OpenVAS Scan in CSV format. Export as CSV Results on OpenVAS.

OssIndex Devaudit

Import JSON formatted output from [OSSIndex Devaudit](https://github.com/sonatype-nexus-community/DevAudit).

Oss Review Toolkit

Import ORT evaluated model reporter output in JSON format. Example: https://github.com/DefectDojo/sample-scan-files/blob/master/ort/evaluated-model-reporter-output.json

PHP Security Audit v2

Import PHP Security Audit v2 Scan in JSON format.

PHP Symfony Security Checker

Import results from the PHP Symfony Security Checker.

Probely

Synchronize Probely Plus findings with DefectDojo.

To set up this integration, set the DefectDojo URL and API key on the Integrations page in Probely. Then select which Product, Engagement, and, optionally, Test you want to synchronize to. The API key needs to belong to a staff user.

Works with DefectDojo 1.5.x and 1.6.x. Probely also supports non-public DefectDojo instances.

For detailed instructions on how to configure Probely and DefectDojo, see https://help.probely.com/en/articles/3811515-how-to-integrate-probely-with-defectdojo

Qualys Scan

Qualys output files can be imported in API XML or WebGUI XML format.

Qualys Webapp Scan

Qualys WebScan output files can be imported in XML format.

Retire.js

Retire.js JavaScript scan (--js) output file can be imported in JSON format.

Risk Recon API Importer

Import findings from Risk Recon via the API. Configure your own JSON report as follows

{
    "url_endpoint": "https://api.riskrecon.com/v1",
    "api_key": "you-api-key",
    "companies": [
        {
            "name": "Company 1",
            "filters": {
                "domain_name": [],
                "ip_address": ["127.0.0.1"],
                "host_name": ["localhost"],
                "asset_value": [],
                "severity": ["critical", "high"],
                "priority": [],
                "hosting_provider": [],
                "country_name": []
            }
        },
        {
            "name": "Company 2",
            "filters": {
                "ip_address": ["0.0.0.0"]
            }
        }

    ],
    "filters": {
        "domain_name": [],
        "ip_address": [],
        "host_name": [],
        "asset_value": [],
        "severity": ["critical"],
        "priority": [],
        "hosting_provider": [],
        "country_name": []
    }
}
  • More than one company finding list can be queried, each with its own set of filters. Company 1 shows all available filters, while Company 2 shows that empty filters need not be present.
  • To query all companies in your Risk Recon instance, simply remove the "companies" field entirely.
  • If the "companies" field is not present and filtering is still desired, the "filters" field can be used to filter all findings across all companies. It behaves the same way as the per-company filters. The "filters" field is disregarded in the presence of the "companies" field.
  • Removing both fields will retrieve all findings in the Risk Recon instance.

Safety Scan

Safety scan (--json) output file can be imported in JSON format.

SARIF

OASIS Static Analysis Results Interchange Format (SARIF). SARIF is supported by many tools. More details about the format here: https://www.oasis-open.org/committees/tc_home.php?wg_abbrev=sarif

ScoutSuite

Multi-cloud security auditing tool. It uses APIs exposed by cloud providers. Scan results are located in scan-reports/scoutsuite-results/scoutsuite_*.json files. Multiple scans will create multiple files if they are run against different cloud projects. See https://github.com/nccgroup/ScoutSuite

Semgrep JSON Report

Import Semgrep output (--json)

SKF Scan

Output of SKF Sprint summary export.

Snyk

Snyk output file (snyk test --json > snyk.json) can be imported in JSON format.

SonarQube Scan (Aggregates findings per cwe, title, description, file_path.)

SonarQube output file can be imported in HTML format.

To generate the report, see https://github.com/soprasteria/sonar-report

Version: >= 1.1.0

SonarQube Scan Detailed (Import all findings from SonarQube html report.)

SonarQube output file can be imported in HTML format.

To generate the report, see https://github.com/soprasteria/sonar-report

Version: >= 1.1.0

SonarQube API Import

SonarQube API will be accessed to gather the report. No report file required.

Follow the below steps to setup API Import:

  1. Configure the SonarQube authentication details by navigating to Configuration->Tool Configuration. Note that the URL should be in the format <http(s)://><sonarqube_hostname>/api. Set the Tool Type to SonarQube. By default the tool will import vulnerabilities only, but additional filters can be set up using the Extras field, separated by commas (e.g. BUG,VULNERABILITY,CODE_SMELL).
  2. In the Product settings, fill in the SonarQube Project Key. The key name can be found by navigating to a specific project and taking the value from the URL <http(s)://><sonarqube_host>/dashboard?id=<key_name>. If you do not provide a SonarQube Project Key, DefectDojo will try to use the Product name as the project name in SonarQube. If you would like to collect findings from multiple projects, you can specify multiple keys as separate SonarQube Configurations in the Product settings.
  3. Once all of the above settings are made, the API import should be able to auto-import all vulnerability information from the SonarQube instance. During import setup you can select which SonarQube Configuration (i.e. which project key) should be used. If you do not choose one, DefectDojo will make a best guess (possible only if you defined a single Product SonarQube Configuration or a single SonarQube Tool Configuration).

NOTE: If HTTPS is in use for SonarQube, the certificate must be trusted by the DefectDojo instance.

SpotBugs

XML report of the textui CLI.

Sonatype

JSON output.

SSL Labs

JSON output of the ssllabs-scan CLI.

Sslscan

Import XML output of sslscan report.

Sslyze Scan

XML report of SSLyze version 2 scan

SSLyze 3 Scan (JSON)

JSON report of SSLyze version 3 scan

Testssl Scan

Import CSV output of testssl scan report.

Terrascan

Import JSON output of terrascan scan report https://github.com/accurics/terrascan

Trivy

JSON report of trivy scanner.

Trufflehog

JSON Output of Trufflehog.

Trustwave

CSV output of Trustwave vulnerability scan.

Twistlock

JSON output of the twistcli tool. Example:

./twistcli images scan <REGISTRY/REPO:TAG> --address https://<SECURE_URL_OF_TWISTLOCK_CONSOLE> --user <USER> --details --output-file=<PATH_TO_SAVE_JSON_FILE>

The CSV output from the UI is now also accepted.

TFSec

Import of JSON report from https://github.com/tfsec/tfsec

Visual Code Grepper (VCG)

VCG output can be imported in CSV or XML format.

Veracode

Detailed XML Report

Wapiti Scan

Import XML report.

Whitesource Scan

Import JSON report

Wpscan Scanner

Import JSON report.

Wfuzz JSON importer

Import the result of Wfuzz (https://github.com/xmendez/wfuzz) if you export the result in JSON (wfuzz -o json -f myJSONReport.json,json).

The HTTP return codes are mapped directly to Severity as follows (this mapping is currently hardcoded in the parser):

HTTP Return Code    Severity
200                 High
401                 Medium
403                 Medium
407                 Medium
500                 Low
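That mapping can be expressed as a simple lookup. This is a sketch of the documented behavior, not the parser's actual code; the fallback for unlisted codes is an assumption.

```python
# Severity assigned per HTTP return code, as documented above.
SEVERITY_BY_CODE = {
    200: "High",
    401: "Medium",
    403: "Medium",
    407: "Medium",
    500: "Low",
}

def severity_for(code):
    # The fallback for unlisted codes is an assumption, not parser behavior.
    return SEVERITY_BY_CODE.get(code, "Info")

print(severity_for(200))  # High
```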

Xanitizer

Import XML findings list report, preferably with parameter 'generateDetailsInFindingsListReport=true'.

Yarn Audit

Import Yarn Audit scan report in JSON format. Use something like yarn audit --json > yarn_report.json.

Zed Attack Proxy

ZAP XML report format.

Import and reimport in DefectDojo

The importers analyze each report and create new Findings for each item reported. DefectDojo collapses duplicate Findings by capturing the individual vulnerable hosts.

Import Form

Additionally, DefectDojo allows re-imports of previously uploaded reports. DefectDojo will attempt to capture the deltas between the original and the new import and automatically add or mitigate findings as appropriate.

Re-Import Form
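Imports can also be automated through the REST API. The sketch below only assembles the pieces of a POST to the /api/v2/import-scan/ endpoint; the host, token, engagement id, scan type, and file name are illustrative, and actually sending the request is left commented out so the sketch has no side effects.

```python
def build_import_request(base_url, api_key, engagement_id, scan_type, report_path):
    """Assemble the pieces of a POST to DefectDojo's /api/v2/import-scan/ endpoint."""
    return {
        "url": f"{base_url}/api/v2/import-scan/",
        "headers": {"Authorization": f"Token {api_key}"},
        "data": {
            "engagement": engagement_id,
            "scan_type": scan_type,   # must match a scan-type name from the list above
            "active": "true",
            "verified": "false",
        },
        "report_path": report_path,   # attached as the 'file' form field when sent
    }

req = build_import_request("https://defectdojo.example.com", "token123",
                           42, "Trivy Scan", "trivy_report.json")
# With the requests library this would be sent as:
# requests.post(req["url"], headers=req["headers"], data=req["data"],
#               files={"file": open(req["report_path"], "rb")})
```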

Bulk import via CSV

Bulk import of findings can be done using a CSV file with the following column headers:

Date:
Date of the finding in mm/dd/yyyy format.
Title:
Title of the finding.
CweId:
CWE identifier; must be an integer value.
Url:
URL associated with the finding.
Severity:
Severity of the finding. Must be one of Info, Low, Medium, High, or Critical.
Description:
Description of the finding. Can be multiple lines if enclosed in double quotes.
Mitigation:
Possible mitigations for the finding. Can be multiple lines if enclosed in double quotes.
Impact:
Detailed impact of the finding. Can be multiple lines if enclosed in double quotes.
References:
References associated with the finding. Can be multiple lines if enclosed in double quotes.
Active:
Indicator if the finding is active. Must be empty, True, or False.
Verified:
Indicator if the finding has been verified. Must be empty, True, or False.
FalsePositive:
Indicator if the finding is a false positive. Must be True or False.
Duplicate:
Indicator if the finding is a duplicate. Must be True or False.
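A quick sketch that writes a one-row CSV with those headers using the standard csv module; the column names follow the list above, and the finding values are purely illustrative.

```python
import csv

COLUMNS = ["Date", "Title", "CweId", "Url", "Severity", "Description",
           "Mitigation", "Impact", "References", "Active", "Verified",
           "FalsePositive", "Duplicate"]

finding = {
    "Date": "01/06/2021",
    "Title": "Example finding",
    "CweId": 79,
    "Url": "https://example.com/login",
    "Severity": "High",
    "Description": "Reflected XSS on the login page.",
    "Mitigation": "Encode output.",
    "Impact": "Session theft.",
    "References": "https://cwe.mitre.org/data/definitions/79.html",
    "Active": "True",
    "Verified": "False",
    "FalsePositive": "False",
    "Duplicate": "False",
}

with open("bulk_findings.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=COLUMNS)
    writer.writeheader()
    writer.writerow(finding)
```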