Data Integration

This article details the process of getting your data into the Brinqa Platform through connectors.

What is data integration?

Data integration in the Brinqa Platform refers to the initial step where data is collected from various sources, such as security tools, vulnerability scanners, and other IT systems. During this phase, Brinqa connects to these sources using connectors to access and retrieve data.

Brinqa sync process diagram

Figure 1. The sync process in data integration.


After data integration, you must also perform data orchestration to consolidate, compute, and prepare your data for visualization and searching, on both the list views and the Explorer graph. Without data orchestration, your data is not ready for validation or viewing.

Create a data integration

Data integration involves configuring and authenticating these connectors with Brinqa, indicating how far back in time you want to retrieve your data, and how long you want to retain the source data in Brinqa. If you have already configured a data integration and are comfortable with the steps in creating data integrations, you can skip to run the data integration.

Only users with the System Administrator role can create data integrations. To create a new data integration, follow these steps:

  1. Navigate to Integrations > Sources and click Create.

    • You can also navigate to Integrations > Connectors and click Use on the connector after it is installed. The Connector field in the data integration information section is pre-filled with the connector you selected.
  2. Fill out the general information:

    • Title: Title of your data integration.

    • Connector: The connector to use in the data integration.

    • Server: The server to process data in the data integration. Local server is selected by default for cloud data sources. You can also create your own data servers.

    • Description: Description of the data integration. For example, a list of the services or data it provides.

  3. Complete the required authentication settings for the connector. Specific fields may differ based on the connector:

    • Server URL: The Uniform Resource Locator (URL) to access the data source.

    • Username and Password: The account credentials for the specific connector account, which must have permissions to access the vendor API and return data.

    • API Keys/Tokens: Access codes used to authenticate the connector with Brinqa. Certain connectors replace the Username and Password fields with API keys; for example, a connector may require an Access Key and a Secret Key instead.


    If you are unsure of the correct server URL, credentials, or API keys for the connector, refer to the specific connector documentation or contact an administrator.

  4. (Optional) Some connectors contain additional options for specific configuration. For example, you can set the page size (maximum number of records to get per API request), modify the number of parallel requests (maximum number of parallel API requests), or skip certificate verification.

  5. Click Next to test the connection and save the configuration.

    The connector tries to access the data sources as specified. If the connection settings are correct, more options display.

  6. In Types, you can view the types of data the connector retrieves. For example, the Qualys Vulnerability Management connector brings in data for Host, Vulnerability, and Vulnerability Definition.

    • Click the entry to see the attributes associated with the data model. For example, the screenshot below shows some of the attributes the Vulnerability data model provides from the source:

    connector attributes

  7. (Optional) Configure operation options for the data model. See connector operation options for additional information.

  8. Set the sync interval. The sync interval determines how far back in time you want to sync your data. The default setting is the beginning of time, which refers to the Unix epoch of January 1, 1970 at 00:00:00 UTC; in practice, this means everything available since your Brinqa Platform was first deployed. Click the drop-down to select from a range of options.


    Brinqa recommends that you use the beginning of time option when you run the data integration initially to import all your data thus far. After the first sync, change the sync interval to the last sync to save time and resources.

  9. (Optional) Set a data lifecycle management policy for the integration. See data lifecycle management for additional information.

  10. Click Create.

If the connection is successful, the page reloads and you should see your new data integration listed. If you do not see it, click Refresh.

If the connection is not successful, the data integration is not created. Double-check the authentication credentials to ensure that they are correct.
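The sync interval chosen in step 8 can be thought of as a cutoff timestamp: only records changed after the cutoff are retrieved. The following Python sketch is purely illustrative of that idea; the option names and mapping are assumptions, not Brinqa's actual implementation.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical sketch: map a sync-interval option to a cutoff timestamp.
def sync_cutoff(option, last_sync=None):
    now = datetime.now(timezone.utc)
    if option == "beginning of time":
        # The Unix epoch: January 1, 1970 at 00:00:00 UTC --
        # effectively "retrieve everything the source has".
        return datetime(1970, 1, 1, tzinfo=timezone.utc)
    if option == "last sync" and last_sync is not None:
        # After the first full import, only fetch changes since the
        # previous sync to save time and resources.
        return last_sync
    if option == "last 30 days":
        return now - timedelta(days=30)
    raise ValueError(f"unknown sync interval: {option}")
```

This mirrors the recommendation above: a first run from the beginning of time, then subsequent runs from the last sync.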

Connector operation options

Some connectors support operation options, which the connector uses to build filters when retrieving data. Operation options can reduce the time a data integration takes to complete and help retrieve more targeted data. Operation options are connector-specific and can only be applied if the vendor API supports them. See the individual connector documentation for more information.

You can define operation options for the type of data you want the connector to retrieve. For example, in step 6 of the previous instructions, follow these steps to limit the Vulnerability data to active vulnerabilities:

  • In Types, without expanding the attributes, hold the pointer over Vulnerability and click Options. A new window appears.

  • For Key, type status; for Value, type active. Operation options are case-sensitive.

  • Click Save.
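Conceptually, each key/value pair you save becomes a filter parameter on the vendor API request. The sketch below is a hypothetical illustration of that translation; real option keys are connector-specific and, as noted above, case-sensitive.

```python
# Hypothetical sketch: translate operation options into vendor API
# query parameters. The "status"/"active" pair matches the example
# above; actual supported keys depend on the connector and vendor API.
def build_query_params(operation_options):
    params = {}
    for key, value in operation_options.items():
        # Pass each option through verbatim -- no case normalization,
        # because vendor APIs treat keys and values as case-sensitive.
        params[key] = value
    return params

# Limiting Vulnerability data to active vulnerabilities:
params = build_query_params({"status": "active"})
```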

Take some time to get familiar with the different data models, attributes, and connector option keys for your connector and decide which ones benefit you in your data integration. You may want to add more attributes or set certain operation options to ensure your system is populated with the data you want to bring in.

Data lifecycle management

Data lifecycle management (DLM) provides a structured method to help maintain your data in the Brinqa Platform. It enables you to designate data as inactive if it hasn't been updated within a set timeframe. Inactive data can then be scheduled for automatic purging, ensuring that your system contains only relevant and current data. DLM policies can help eliminate outdated information, which can declutter the system, speed up response times, and ultimately lead to more efficient and accurate analysis of your data.

The following diagram illustrates the concept of DLM:

DLM diagram

Figure 2. DLM feature workflow

Consider the following scenario: You have created two data integrations that bring in Host records, one of them from Qualys Vulnerability Management (VM). The source data from these integrations feeds into a single consolidated Host record in the Brinqa Platform (see Data consolidation for additional information).

The following diagram illustrates the process.

Qualys integration example

Figure 3. Multiple source data consolidated into a unified data model.

In addition, let's say you've configured these integrations to have a DLM policy to mark hosts as inactive after 30 days of no updates and then to purge them 30 days later, resulting in a total of 60 days from inactivity to removal. The process works as follows:

  • Should one integration not update the Host record for 30 days, the source data coming in from that integration is labeled as inactive. However, the consolidated Host record remains active as long as it continues to receive fresh source data from the other integration.

  • If neither integration provides updates within a 30-day window, both the source data and the consolidated Host record are tagged as inactive. They are then scheduled for removal after the designated period, provided that no other source has been consolidated into this Host record.

This approach ensures that the Brinqa Platform manages your data by marking it as inactive based on specified periods of inactivity or other conditions, allowing for the removal of data that is no longer being updated.
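The consolidation rule in this scenario can be sketched in a few lines of Python. This is an illustrative model of the behavior described above, not Brinqa's internal logic: a source's data goes inactive after 30 days without updates, and the consolidated record stays active while any source is still fresh.

```python
from datetime import datetime, timedelta, timezone

INACTIVE_AFTER = timedelta(days=30)  # mark source data inactive
PURGE_AFTER = timedelta(days=30)     # purge a further 30 days later (60 total)

def source_is_inactive(last_update, now):
    """A source's data is inactive once 30 days pass without an update."""
    return now - last_update >= INACTIVE_AFTER

def consolidated_is_active(source_last_updates, now):
    """The consolidated Host record stays active while ANY source is active."""
    return any(not source_is_inactive(t, now) for t in source_last_updates)

now = datetime(2024, 3, 1, tzinfo=timezone.utc)
stale = now - timedelta(days=45)  # one integration stopped updating
fresh = now - timedelta(days=2)   # the other is still syncing
```

With one fresh source, `consolidated_is_active([stale, fresh], now)` is true; only when every source goes stale does the consolidated record become inactive and eligible for purging.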

When configuring a data integration, you have the option to enable a DLM policy for each type of data you try to bring in. The options are located in the Data lifecycle section, accessible when either creating a new integration or editing an existing one. These options include:

  • Mark data inactive (days): Select the appropriate duration for data to be marked as inactive if not updated. Options include: 1, 5, 10, 30, 60, 90, 120, 180, or 365 days.

  • Delete inactive data after (days): Select the appropriate duration after which the inactive data will be deleted. Options include: 1, 5, 10, 30, 60, 90, 120, 180, or 365 days.

  • Advanced settings: (Optional) You can use a specific condition to determine when data should be marked as inactive. The default attribute used to determine inactivity is lastSynced, but you can enter a BCL condition to specify a different attribute to define activity.

    • You must make sure that the attributes used in the condition exist on the object that you are configuring. For example, when setting a lifecycle policy for the Host object, use attributes that are associated with Host. You can find the available attributes for each object from the Types section, as shown in step 6 of the create a data integration steps.


      Brinqa recommends starting out by using the default DLM policy provided with the data integration. Using the advanced feature can have unintended consequences. Brinqa provides you with recommended practices and policies for DLM, including the default number of inactivity days and deletion days. If you want to use the advanced feature, the option is available, but we highly recommend consulting your Brinqa Support specialists beforehand.

To help determine the status of your data under DLM policies, the following attributes show when a dataset is marked as inactive and when it is purged, based on the configured DLM policies. You can see the values for these attributes in the Detail or Show view of any data model that supports a DLM policy:

  • Lifecycle inactive date: Indicates the date that the dataset is marked as inactive based on the DLM policy.

  • Lifecycle purge date: Indicates the date that the dataset is purged based on the DLM policy.

  • Lifecycle status: Indicates the current status of the dataset based on the DLM policy.

    For additional information about how the Brinqa Platform determines the status of your assets and findings, see Status Configuration.
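Under the default policy, these lifecycle dates follow directly from the lastSynced attribute and the two durations configured above. The Python sketch below is an assumed model of that arithmetic, not Brinqa's internal implementation:

```python
from datetime import datetime, timedelta, timezone

def lifecycle_dates(last_synced, inactive_days, delete_days):
    """Hypothetical sketch: derive lifecycle attributes from lastSynced.
    The inactive date falls inactive_days after the last sync, and the
    purge date follows delete_days after that."""
    inactive_date = last_synced + timedelta(days=inactive_days)
    purge_date = inactive_date + timedelta(days=delete_days)
    return inactive_date, purge_date

# A 30-day inactivity window plus a 30-day deletion window gives
# 60 days total from last update to removal.
last_synced = datetime(2024, 1, 1, tzinfo=timezone.utc)
inactive, purge = lifecycle_dates(last_synced, 30, 30)
```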

The following screenshot shows an example DLM configuration:

Data lifecycle management config


You may encounter a message stating, "This data integration contains types that do not support configuration of a data lifecycle policy." This message appears because not all data types are suitable for lifecycle management due to their nature. Certain data types, such as vulnerability definitions, do not adhere to a lifecycle policy, as their validity is not time-sensitive.

Run the data integration

Now that you have created your data integration, you can start running syncs to gather data. By default, the data integration runs once a day automatically through data orchestration, but you can sync your data manually as well. A successful sync ensures that your connector credentials are correct, the appropriate permissions to make API calls and return data are applied, and the data retrieved are valid. Data integration does not consolidate your data or calculate risk scores. Data orchestration and data integration work together to get your data ready for searching and visualization.

Hold the pointer over the data integration until you see a few options appear on the right-hand side. Click Sync to manually sync and run your data integration. You can also perform a manual sync of your data by clicking Sync on the data integration details page. Both methods run the data integration sync to bring your data into the Brinqa Platform. There are options to select how far back you want to sync your data. See Step 8 in Create a data integration for more information.

The time it takes for the sync to complete depends on several factors, such as the number of data models receiving data, the number of attributes, the number of filters, and the amount of data coming in. A sync can therefore take anywhere from a few minutes to a few hours.

If you have multiple data integrations, it is recommended that you do not run syncs concurrently. Allow one sync to finish before performing further syncs.


Make sure you also run data orchestration before you begin to validate your data.

Edit or delete a data integration


Proceed with caution when deleting a data integration. This action removes all source data exclusively linked to that integration. Specifically, Unified Data Models (UDMs) that rely solely on the data integration you are deleting will also be removed. Additionally, dependent datasets and requests, including exceptions or remediation tasks, may be affected. To prevent unintended data loss, carefully review all potentially impacted areas prior to deletion.

Navigate to Integrations. Hold the pointer over the data integration until you see a few options appear on the right-hand side. These options are Sync, Show, Edit, and a kebab (three vertical dots) menu. The kebab menu contains options to configure the mapping (if applicable), test the connection, delete the data integration, or cancel the data integration sync if it is running.

After initiating the deletion of a data integration, it is not immediately removed. Instead, the data integration is marked for deletion. The process takes 60 minutes to complete. During this time, you can continue to create new data integrations and run syncs.

Troubleshooting tips

This section outlines some troubleshooting tips to help you identify and resolve some common issues you may come across when creating and running data integrations.

Incorrect API or server URL

The data integration can fail if you use an incorrect API or server URL. An incorrect URL prevents the connector from reaching the appropriate server, which can lead to connection errors.

For example, if you provide an incorrect API URL for the Semgrep connector, such as https://semgrep-dev, the system may fail to connect and return an error message indicating a "Temporary failure in name resolution". This means that the provided URL does not point to a valid server address; refer to the Semgrep connector documentation for the correct API URL. Below is an example of this error message:

Incorrect API URL for Semgrep

To resolve this error, follow these steps:

  1. Verify the correct API or server URL from the individual connector documentation.

  2. Double-check the URL provided in the integration connection settings for any mistakes:

    • Pay special attention to common errors like misplaced dots, hyphens, or unnecessary spaces.

    • Ensure that you're using the correct protocol (http versus https).

  3. Replace the incorrect URL with the verified one in your integration configuration.

  4. Click Next to save the changes and confirm that the connection is successful.
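A few of the URL mistakes listed in step 2 can be caught before you even test the connection. The following Python sketch performs some basic offline sanity checks; it is illustrative only and cannot detect a well-formed URL that simply fails DNS resolution, as in the "Temporary failure in name resolution" example above.

```python
from urllib.parse import urlsplit

def url_problems(url):
    """Flag common server-URL entry mistakes: stray whitespace,
    a missing or unsupported protocol, or no hostname at all.
    This is an offline format check, not a connectivity test."""
    problems = []
    if url != url.strip():
        problems.append("leading or trailing whitespace")
    parts = urlsplit(url.strip())
    if parts.scheme not in ("http", "https"):
        problems.append("missing or unsupported scheme (use http or https)")
    if not parts.netloc:
        problems.append("no hostname")
    return problems
```

For instance, a value pasted without its scheme (such as a bare hostname) fails both the scheme and hostname checks, while a correctly formed URL returns no problems.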

Inaccurate authentication credentials

The data integration can fail due to inaccurate or expired credentials such as API or access keys, or user authentication details such as emails, usernames, or passwords. These credentials are crucial for the Brinqa connectors to authenticate and communicate with the external source.

For example, using an incorrect API token for the LeanIX EAM connector results in an error message with a 401 Unauthorized status. This status indicates that the system could not authenticate your credentials, specifically during the OAuth 2.0 token exchange, as suggested by the /oauth2/token endpoint in the error message. Below is an example of such an error message:

Inaccurate API key error message

The 401 Unauthorized error is a standard HTTP response indicating that the provided credentials are not valid for the requested resource. If you encounter this error, check your API token or other authentication credentials for any inaccuracies, particularly ensuring that the token is current and was entered correctly.

To resolve such issues, follow these steps:

  1. Confirm the validity of the authentication credentials with the service provider. If you're not sure how to obtain or generate these credentials, refer to the individual connector documentation.

  2. Double-check the credentials entered in the integration connection settings for any inaccuracies:

    • Ensure that there are no leading or trailing spaces, and that the correct cases are used.
  3. Re-enter the API key, access key, email, username, or password without any typos or misplaced characters.

  4. Click Next to save the changes and confirm that the connection is successful.
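The whitespace and typo checks in steps 2 and 3 can be sketched as a small pre-check. This hypothetical helper only catches mechanical entry mistakes (copy-paste whitespace, empty fields); it cannot validate the credential against the vendor, which is what the 401 Unauthorized response reports.

```python
def credential_issues(value):
    """Spot the most common credential entry mistakes: an empty field
    or surrounding whitespace, which often sneaks in via copy-paste.
    Purely an illustrative pre-check before testing the connection."""
    issues = []
    if not value:
        issues.append("empty value")
    elif value != value.strip():
        issues.append("leading or trailing whitespace")
    return issues
```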

Insufficient permissions

The integration can fail because the account associated with the provided authentication credentials does not have the necessary permissions to access the API server and return data. To resolve this, follow these steps:

  1. Verify the access level of the account associated with the API key or authentication credentials. If you're not sure what permissions are required, refer to the individual connector documentation.

  2. If necessary, modify the account permissions or use an account with the appropriate level of access:

    • Consult with your system administrator to adjust the account permissions.

    • Update the integration configuration with the account that has the required access.

  3. Click Next to save your changes.

  4. Run the integration again to verify that the issue is resolved.

    • See how to validate your data for additional information on how to verify that the necessary permissions have been applied to retrieve the proper data.

Outdated connector version in the configuration

The data integration may not function as expected if the connector in your integration is outdated. Brinqa regularly updates connectors to introduce new functionalities and address any errors. Running an outdated version of a connector means missing out on these important updates, which can potentially lead to issues.

For example, consider an integration with the GitHub connector where a recent update has been released to address a specific issue. If your integration still references an older version of the GitHub connector, the issue persists until you update the connector and select the new version in the integration configuration settings:

Correct connector version

To ensure you benefit from the latest fixes and improvements, follow these steps:

  1. Update the connector to the newest available version.

  2. Navigate to the integration configuration.

  3. Click the connector drop-down and select the most recent version.

  4. Click Next to save the changes.

  5. Run the integration to verify that the issue is resolved.


If you need to use a previous version of a connector, please contact Brinqa Support.

Contact Brinqa support

If you still have issues with your data integrations failing or not connecting properly after following the troubleshooting tips, consider these options:

  • Use the support portal to reach Brinqa Support specialists.

  • Email Brinqa Support. When you do, be prepared to provide details about the issue, including which connector and version you're using, your Brinqa Platform version, and any steps you have already taken in attempting to resolve your issue. Providing more information will help Brinqa Support assist you more efficiently.