Integration options¶
Throughout the portal you can focus on your preferred integration approach (and switch between approaches as you like):
Atlan University
See it in action in our automated enrichment course (45 mins).
You can use dbt's `meta` field to enrich the metadata of resources ingested from dbt into Atlan. Atlan will ingest the information from this field and update the corresponding assets in Atlan accordingly.
This gives you a powerful way to keep your dbt assets documented directly as part of your dbt work.
- The `description` at the top level of an asset defined in dbt will already be mapped to the description field for that asset in Atlan.
- More detailed metadata, however, needs to be specified within the `meta` field...
- ...and within the `meta` field, further within the `atlan` sub-field.
- Attributes, such as certificates, announcements, or owners, need to be specified within the `attributes` sub-field.
- Classifications need to be specified within a `classifications` sub-field.
- Note that the `meta` field and its sub-structure (including all the detailed attributes) can also be applied to columns within a model.
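Putting these points together, a model's schema YAML might look like the following sketch. The nesting (`meta` → `atlan` → `attributes` / `classifications`) follows the points above, but the specific attribute and classification names used here (such as `certificateStatus` and `PII`) are illustrative assumptions only:

```yaml
version: 2
models:
  - name: my_model
    description: Top-level description, mapped as-is to Atlan.
    meta:
      atlan:
        attributes:
          certificateStatus: VERIFIED  # hypothetical attribute name
        classifications:
          - PII                        # hypothetical classification name
    columns:
      - name: my_column
        description: Column descriptions map the same way.
        meta:
          atlan:
            attributes:
              certificateStatus: DRAFT  # hypothetical attribute name
```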
This rich metadata will then be loaded to the corresponding attributes on the asset in Atlan.
For more details on specific examples, see the dbt tabs in the Common asset actions snippets.
Atlan University
Walk through step-by-step in our intro to custom integration course (30 mins).
Obtain the SDK¶
The SDK is available on Maven Central, ready to be included in your project:
repositories {
mavenCentral()
}
dependencies {
implementation "com.atlan:atlan-java:+" // (1)
testRuntimeOnly 'ch.qos.logback:logback-classic:1.2.11' // (2)
}
1. Include the latest version of the Java SDK in your project as a dependency. You can also give a specific version instead of the `+`, if you'd like.
2. The Java SDK uses slf4j for logging purposes. You can include logback as a simple binding mechanism to send any logging information out to your console (standard out).
Or, if you use Maven:
<dependency>
    <groupId>com.atlan</groupId>
    <artifactId>atlan-java</artifactId>
    <version>${atlan.version}</version>
</dependency>
Configure the SDK¶
Set two values on the static `Atlan` class:

- `Atlan.setBaseUrl()` should be given your Atlan URL (for example, `https://tenant.atlan.com`)
- `Atlan.setApiToken()` should be given your Atlan API token, for authentication (don't forget to assign one or more personas to the API token to give access to existing assets!)
That's it — once these are set, you can start using your SDK to make live calls against your Atlan instance! 🎉
Set the base URL first
Since version 0.9.0 of the Java SDK, you must call `setBaseUrl()` before calling `setApiToken()`.
Here's an example of setting these based on environment variables:
import com.atlan.Atlan;
public class AtlanLiveTest {
static {
Atlan.setBaseUrl(System.getenv("ATLAN_BASE_URL"));
Atlan.setApiToken(System.getenv("ATLAN_API_KEY"));
}
}
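The two environment variables read above could be set in your shell before launching the JVM, for example:

```shell
export ATLAN_BASE_URL=https://tenant.atlan.com
export ATLAN_API_KEY="..."
```

Keep the real token value out of anything you commit to version control.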
What's next?¶
Atlan University
Walk through step-by-step in our intro to custom integration course (30 mins).
Obtain the SDK¶
The SDK is currently available on pypi. You can use pip to install it as follows:
pip install pyatlan
Configure the SDK¶
There are two ways to configure the SDK:
Using environment variables¶
- `ATLAN_API_KEY` should be given your Atlan API token, for authentication (don't forget to assign one or more personas to the API token to give access to existing assets!)
- `ATLAN_BASE_URL` should be given your Atlan URL (for example, `https://tenant.atlan.com`)
Here's an example of setting those environment variables:
export ATLAN_BASE_URL=https://tenant.atlan.com
export ATLAN_API_KEY="..."
atlan_live_test.py
from pyatlan.client.atlan import AtlanClient

# picks up ATLAN_BASE_URL and ATLAN_API_KEY from the environment
client = AtlanClient()
On client creation¶
If you prefer to not use environment variables, you can do the following:
atlan_live_test.py
from pyatlan.client.atlan import AtlanClient

client = AtlanClient(
    base_url="https://tenant.atlan.com",
    api_key="...",
)
Careful not to expose your API token!
We generally discourage including your API token directly in your code, in case you accidentally commit it into a (public) version control system. But it's your choice exactly how you manage the API token and include it for use within the client.
That's it — once these are set, you can start using your SDK to make live calls against your Atlan instance! 🎉
What's next?¶
Obtain the SDK¶
For Kotlin, you can reuse the existing Java SDK as-is. It is available on Maven Central, ready to be included in your project:
repositories {
    mavenCentral()
}
dependencies {
    implementation("com.atlan:atlan-java:+") // (1)
    implementation("io.github.microutils:kotlin-logging-jvm:3.0.5") // (2)
    implementation("org.slf4j:slf4j-simple:2.0.7")
}
1. Include the latest version of the Java SDK in your project as a dependency. You can also give a specific version instead of the `+`, if you'd like.
2. The Java SDK uses slf4j for logging purposes. You can include slf4j-simple as a simple binding mechanism to send any logging information out to your console (standard out), along with the `kotlin-logging-jvm` library.
Configure the SDK¶
Set two values on the static `Atlan` class:

- `Atlan.setBaseUrl()` should be given your Atlan URL (for example, `https://tenant.atlan.com`)
- `Atlan.setApiToken()` should be given your Atlan API token, for authentication (don't forget to assign one or more personas to the API token to give access to existing assets!)
That's it — once these are set, you can start using your SDK to make live calls against your Atlan instance! 🎉
Set the base URL first
Since version 0.9.0 of the Java SDK, you must call `setBaseUrl()` before calling `setApiToken()`.
Here's an example of setting these based on environment variables:
import com.atlan.Atlan

fun main() {
    Atlan.setBaseUrl(System.getenv("ATLAN_BASE_URL"))
    Atlan.setApiToken(System.getenv("ATLAN_API_KEY"))
}
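If you also want to emit your own log lines through the `kotlin-logging-jvm` dependency listed above, a minimal sketch (independent of the Atlan SDK itself) might look like this; the log message is purely illustrative:

```kotlin
import mu.KotlinLogging

// kotlin-logging wraps slf4j; the slf4j-simple binding above routes output to the console
private val logger = KotlinLogging.logger {}

fun main() {
    logger.info { "SDK configured; ready to call Atlan" }
}
```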
What's next?¶
Atlan produces events when certain activities occur in the system. You can tap into these in a push-based integration model to take action the moment they occur.
To tap into the events, you need to first set up a webhook in Atlan.
Have a look at the event handling pattern for more details on implementing event-handling from webhooks using either of the SDKs.
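As a minimal sketch of the receiving side: the payload shape below (a JSON body with `type` and `payload.guid` keys) is an assumption for illustration only, since the actual webhook contract isn't described here. A handler behind your webhook endpoint might dispatch on the event type like this:

```python
import json


def handle_event(raw_body: str) -> str:
    """Dispatch an incoming webhook event based on its (hypothetical) type field."""
    event = json.loads(raw_body)
    event_type = event.get("type", "UNKNOWN")
    if event_type == "ASSET_UPDATED":
        # react to the change, e.g. re-sync downstream documentation
        return f"updated: {event.get('payload', {}).get('guid')}"
    # ignore anything we don't recognize
    return "ignored"
```

The same dispatch logic could sit behind any HTTP framework you already use to expose the webhook endpoint.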
Atlan University
Walk through step-by-step in our intro to custom integration course (30 mins).
Not for the faint of heart
There are a number of details and nuances to understand about the underlying REST APIs to use them most effectively.
Ultimately, all pull-based integration mechanisms (including the SDKs) use the REST API; however, the SDKs also encode best practices to avoid the need to understand all these details and low-level nuances. So if you want to adopt these best practices from the start, we would strongly recommend using either the Java or Python SDK, rather than the raw REST APIs directly.
That being said, we have documented the raw REST API interactions in most cases. So if you really want to interact with the APIs directly, there is still some guidance: anywhere you see Raw REST API, you'll find details on endpoint URLs, methods, and payloads.
You can use the REST API directly, if you're willing to learn all the nuances of:

- the underlying REST API communications (HTTP protocols, methods, headers, endpoint URLs, response codes, and so on)
- translating the complex JSON structures for each request and response payload
- knowing exactly which values are required (and which aren't) depending on the object you're interacting with and the operation you're carrying out, including which exact string (and capitalization) to use for values that are really enumerations behind the scenes
Why not just publish an OpenAPI spec?
We did try this in the past, as we liked the idea of generating client libraries using tools like the OpenAPI Generator .
Unfortunately, in our own testing of these tools, we found that:
We could generate code that is free from syntax errors, but the generated code was not fully functional. Problems we found:
- The generated code drops significant portions of data from payloads. (Our APIs are payload-rich and endpoint-light, while the generators seem to favor endpoint-rich and payload-light APIs.)
- The various objects the generator creates often make developer consumption cumbersome — long, difficult-to-read object names; many variations of similar objects; and so on.
- The generated code naturally does not include any validation that can't be encoded in the OpenAPI spec. To add this we'd need to wrap the generated code with another layer anyway.
After several attempts at massaging the OpenAPI spec to meet the constraints of the code generator, we eventually decided to follow the approach we've seen other API-first companies adopt: very few appear to rely on these code generators, but rather directly maintain their own SDKs, for which they may have their own (internal) code generators. (Which is in fact exactly what we're doing as well.)
Request for feedback¶
If you use the raw REST APIs rather than one of the provided SDKs, we would love to understand more. If you can spare a few minutes to fill out our survey, we would be very grateful!
Atlan University
Walk through step-by-step in our intro to custom integration course (30 mins).
Same caveat as with the raw REST APIs
The Postman collection provided still uses the raw REST APIs, and therefore carries the same caveats and warnings.
Ultimately, all pull-based integration mechanisms (including the SDKs) use the REST API; however, the SDKs also encode best practices to avoid the need to understand all these details and low-level nuances. It is not possible to encode all of these best practices into the Postman collection — so if you want to adopt these best practices from the start, we would still strongly recommend directly using either Java or Python, rather than using Postman.
If you really just want to do some initial experimentation directly against the API, you can use Postman.
Check out our Getting started with the APIs article for a walkthrough of setting this up.
But we would still strongly recommend quickly moving to one of the SDKs for integrations or automations.