Set up your environment: To use DataGraph, you will need to set up some assets in Exchange and deploy some APIs that we will leverage within DataGraph.
Publish specifications in Design Center
Create assets in API Manager
Deploy APIs in Runtime Manager
DataGraph Setup
Add Orders API to unified schema
Add Shipment API to unified schema
Add Sales API to unified schema
What you'll learn
How to create a unified schema
How to merge types
How to link types
What you'll need
Anypoint Platform account with DataGraph entitlement
Anypoint Platform account with API Manager entitlement
At least 0.3 vCores of capacity available in CloudHub
To import APIs into DataGraph, you must publish their specifications (RAML or OAS) to Exchange. For this lab, we will publish the Orders, Shipment, and Sales APIs to build our Unified Schema. Download the assets below.
After downloading the RAML spec zips, navigate to the Exchange for the Business Group you will be using for this lab.
Import Orders API
Click Publish new asset in the top left.
For the asset name, enter Orders API
For the asset type, select REST API in the drop-down
Check Upload a RAML
Click Publish
Repeat these steps for Shipment API and Sales API
After publishing all three APIs, your Exchange assets should look like the following:
To manage the APIs, we need to create an API Manager entry for each one. We will also need to save the autodiscovery IDs for deployment later. Open a text editor of your choice and paste in the following to keep track of these values:
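The exact tracking snippet was not captured here, but a simple template might look like the following (fill in the values as you go; the labels match the values gathered later in this lab):

```text
Orders API autodiscovery ID:
Shipment API autodiscovery ID:
Sales API autodiscovery ID:
Environment Client ID:
Environment Client Secret:
```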
In the API name field, start typing Orders API and select the API from the drop-down when it appears.
Click Save
After creating the API, log the autodiscovery ID in your text editor
Repeat this process for the Shipment API and Sales API
After completing the setup process, you should see the following entries in API Manager:
Now that we have created the API assets in Exchange and API Manager, we are ready to deploy the applications. Before deploying, we need the Environment Client ID and Secret so the apps can register with API Manager.
Navigate to Access Management
Click on Environments
Click on the text of the environment you will be deploying into
Copy the Client ID and Secret into your text editor
Set the application name to datagraph-orders-api-{SOMETHING UNIQUE LIKE YOUR INITIALS}
In the Application File box, click Choose file and select Upload file
Select the Orders API JAR file
Click on the Properties tab
Add a property called api.autodiscovery.id and set the value to the autodiscovery ID you logged earlier
Add a property called anypoint.platform.client_id and set the value to the Environment Client ID we copied earlier
Add a property called anypoint.platform.client_secret and set the value to the Environment Client Secret we copied earlier
Add a property called app.client_secret and hardcode the value to 1111
Add a property called app.client_id and hardcode the value to 1111
Click Deploy Application
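Taken together, the five properties above can be sketched as a properties file. The bracketed values are placeholders; substitute the IDs you saved in your text editor:

```properties
# Placeholders -- replace with the values you logged earlier
api.autodiscovery.id=<autodiscovery ID from API Manager>
anypoint.platform.client_id=<environment client ID>
anypoint.platform.client_secret=<environment client secret>
# Hardcoded demo credentials used by the lab applications
app.client_id=1111
app.client_secret=1111
```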
After deployment finishes, navigate back to Runtime Manager and click on the application
Copy the App URL
Navigate to API Manager
Click on the text of the Orders API entry in order to edit the entry.
Paste the URL you copied into the Consumer endpoint field, adding /api to the end.
Repeat this process for the Shipment API and Sales API
With our APIs deployed and managed, we are finally ready to get started with DataGraph! This is the easy part! This scenario walks you through using the APIs we published in Exchange (with their corresponding managed instances) to build the DataGraph Unified Schema and then execute GraphQL queries. This includes:
Adding required APIs to DataGraph
Adapting the data model to our needs
Running GraphQL queries
Navigate to DataGraph
Click Add API
Select datagraph-order-api from the list of APIs in Exchange and click on Next: Configure URL
Confirm the version of the API you want to add to DataGraph and click on Confirm selection
By default, you should see the API instance you set up earlier automatically selected. You should see your unique CloudHub URL.
Click on Next: Provide authentication
By default, No Auth is selected. Explore the options available; select No Auth when done
Click the Next: Preview Schema button in the popup. Here you can view the schema pulled from the API specification
Click Next: Edit Schema
At this point, it is important to notice that Nested Types are hidden by default. Let's enable them so that developers can access this data if they need it!
Click on each Nested Type and change the Desired State to Visible
One additional step is required. To allow other types in the local API schema or Unified Schema to link or merge with this type and extend the datasets, you must enable collaboration. Collaboration empowers developers to create a robust Unified Schema. Click on the Order type on the left-hand side of the panel and click Enable Collaboration
When you click on Enable Collaboration you'll be prompted to define the default query and the primary key. Make sure you select orderId as the primary key.
Once confirmed, you'll have the following information for the Order type regarding Collaboration Permissions
Finally, click Next: Add to Unified Schema to proceed. The following message appears; just click Proceed to Unified Schema.
Following the same steps used previously, add the datagraph-shipment-api to the Unified Schema.
For this new API, you'll have a couple of Nested Types (Carrier, Status) as well. Make them visible just in case anyone wants to run queries which include this information. Be sure to enable collaboration for the Shipment type.
Click Next: Add to unified schema to finalize.
Add the datagraph-sales-api following the same steps you used for the previous APIs.
At this point, you should see that we have a conflict: there is already another type in the Unified Schema with the name Order. We will need to resolve this conflict before proceeding.
To resolve this issue, we can Rename this new Order type or Merge with the current type. We know that in this case, these are the same objects; we want to extend the existing Order type with the fields coming from this new object. In order to do this, let's merge the types!
Click on the merge link in the error message; this should take you down to the merge section on the same screen. If it doesn't, simply scroll down.
DataGraph automatically recommends merging this Order type from the Sales API with the Order type already in the Unified Schema. Select Order type from the dropdown list to merge with the new Order type being imported.
At this point, you have to choose how you want to merge these structures. Click See an example for each merge type to see the potential structures. In this case, we want an Order type that contains data from both the original type and the new type we're adding, so we will select the first option: Add the fields from the two...
Before we can finalize these settings, we have to Enable Collaboration - a requirement whenever merging or linking types. By now this process should be a breeze for you! Make sure you choose orderId as your primary key.
Before finalizing, click Preview merge result in order to get a preview of the final merge result
In this preview, you can see how the merge has been configured. Pay attention to the list of fields: with a single query, we'll get details of the Order coming from two different APIs. Notice that the field customerEmail is not taken from the new Order type, because a field with the same name already exists in the existing Order type.
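As a rough sketch, the merged type might look like the following GraphQL SDL. The field names here are illustrative, drawn from the queries used later in this lab; your actual schema depends on the API specifications you imported:

```graphql
# Illustrative only -- actual fields depend on your imported specs
type Order {
  orderId: ID!           # primary key chosen when enabling collaboration
  status: String
  customerEmail: String  # kept from the existing Order type on name conflict
  totalAmount: Float     # contributed by the Sales API's Order type
  product: Product
  deliveryInfo: Delivery
}
```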
Click Confirm merge
Now that we've cleared up the conflicts, we can proceed!
We could finish up this API here by adding the merged Order type to the Unified Schema. Before we do that, though, we will explore some additional capabilities of DataGraph by making some changes to the schema. DataGraph lets us hide and unhide individual fields, as well as rename them. Let's change the field named origination to location.
Click the Rename field button to the right of the field origination
Set the name to location
Click Confirm
Next, go ahead and make the remaining nested types visible. This should be pretty easy for you at this point!
We still aren't done! We've seen how to merge types, but what about linking types? We now have a type, Delivery, which we can use to link to the Shipment API. Let's go ahead and create that link.
Click on the Delivery type
Scroll down to the Link section
Select the type you want to link to: Shipment
Select the ID to use when linking; this ID comes from the Delivery type and is shipmentId
Click Save changes
After linking the shipment field in the Delivery type with the existing Shipment type in the Unified Schema, you'll see the following in the Delivery type configuration
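Conceptually, the link adds a resolvable Shipment field to Delivery. A hypothetical SDL sketch follows; the type and field names are assumed from this lab's schema:

```graphql
# Sketch -- names assumed from this lab's schema
type Delivery {
  shipmentId: ID           # key used to resolve the link
  deliveryAddress: Address
  shipment: Shipment       # resolved from the Shipment API via shipmentId
}
```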
Finalize by clicking Next: Add to unified schema
Click the Run Query button in the top right.
The Unified Schema endpoint is automatically secured using the client_id enforcement policy, so you need to request access to get any data. When you click Run Query, you will be prompted to either select an existing application or create a new one. Let's create a new application.
Once the credentials for the newly created application are validated, you'll see the query builder. Run the following query:
{
  ordersById(id: "122344") {
    status
    customerEmail
  }
}
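Outside the query builder, you can also call the Unified Schema endpoint directly over HTTP, passing the application credentials as client_id/client_secret headers. The following is a minimal Python sketch; the endpoint URL and header names are assumptions based on a typical client_id enforcement setup, so copy the exact values from your DataGraph console:

```python
import json
import urllib.request

# Assumed values -- copy the real endpoint from the DataGraph console and
# the credentials from the application you just created.
DATAGRAPH_URL = "https://example.anypoint.mulesoft.com/datagraph/graphql"
CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"

# The same query run in the query builder above.
query = """
{
  ordersById(id: "122344") {
    status
    customerEmail
  }
}
"""

def run_query(url, client_id, client_secret, query):
    """POST a GraphQL query with client_id enforcement headers."""
    payload = json.dumps({"query": query}).encode("utf-8")
    request = urllib.request.Request(
        url,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "client_id": client_id,
            "client_secret": client_secret,
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))

# Example call (requires valid credentials and network access):
# result = run_query(DATAGRAPH_URL, CLIENT_ID, CLIENT_SECRET, query)
```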
Let's try a more complicated query that gets details from all three APIs!
{
  ordersById(id: "122344") {
    status
    customerEmail
    totalAmount
    product {
      productNumber
      name
      unit
      unitPrice
    }
    deliveryInfo {
      shipment {
        carrier
        expectedDeliveryDate
        trackingNumber
      }
      deliveryAddress {
        city
        postalCode
      }
    }
  }
}
The query builder includes capabilities beyond simply running queries! It empowers you to verify that the Unified Schema is working properly and that the downstream APIs are correctly configured and performing well.
Let's click the Track query button and run the same query from the end of the previous section:
As you can see above, we are able to track the calls being made by this query, and the response times of the downstream sources. This is a powerful tool providing you the ability to track down issues with responsiveness and see a high level profile of your query.
Next, click View Response Logs
By default, the logs open filtered to your most recent query:
This provides a quick and easy view into exactly what is going on within the DataGraph runtime.