At work we use Domo for analysis of customer behavior. We had a need to query data stored in Domo DataSets, which are its rough equivalent to database tables. While Domo offers their own Java SDK, it hasn't been updated lately, and so I thought it better to create my own client, leveraging what I had done earlier for a Salesforce CRM client, with its Spring OAuth client support. Note the client currently supports just DataSets. Domo offers several other APIs which the client can probably be expanded to support without too much difficulty, leveraging what the DataSets calls do.
The Java client is available in my blog-samples GitHub project. Steps to use:
Follow the Domo documentation to create a client ID and secret, and also to create an access token. In both cases, just the "data" scope is needed. The access token will not be used by the Java client, but will be used next to test permissions.
Confirm the client credentials have the permissions necessary to access the desired DataSets. Use the access token to list the DataSets you have access to and then query one or more. The list call will give you the DataSet ID needed for the subsequent query. If you cannot see the desired DataSet in the list call, or if the query is not returning the DataSet's rows, your account's permissions may need to be adjusted before proceeding further. (A scripted version of this permissions check is sketched after these steps.)
Confirm the Java client can work with those credentials. Configure the RestServicesTest in the client's integration tests with your desired DataSet ID and update the query supplied to the DataSetQueryRequest object with actual column names from that DataSet. Also, follow the instructions in the resources/application-test.properties file for adding your client credentials, making sure, of course, not to accidentally commit those to public source control.
Once all is configured, run the test class either from your IDE debugger or from the command line (./gradlew clean build integrationTest) to ensure connectivity is working.
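If you'd like to script the earlier permissions check rather than use Postman or curl, a sketch like the below should work (Java 11's HttpClient; endpoint paths per Domo's API documentation; the access token and DataSet ID are placeholders to fill in):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DomoPermissionCheck {
    public static void main(String[] args) throws Exception {
        String accessToken = "...";   // the access token created earlier
        String dataSetId = "...";     // a DataSet ID returned by the list call

        HttpClient client = HttpClient.newHttpClient();

        // list DataSets visible to these credentials
        HttpRequest list = HttpRequest.newBuilder()
                .uri(URI.create("https://api.domo.com/v1/datasets?limit=5"))
                .header("Authorization", "Bearer " + accessToken)
                .GET()
                .build();
        System.out.println(client.send(list, HttpResponse.BodyHandlers.ofString()).body());

        // query one DataSet's rows ("table" is the alias Domo's query API uses)
        HttpRequest query = HttpRequest.newBuilder()
                .uri(URI.create("https://api.domo.com/v1/datasets/query/execute/" + dataSetId))
                .header("Authorization", "Bearer " + accessToken)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{\"sql\": \"SELECT * FROM table\"}"))
                .build();
        System.out.println(client.send(query, HttpResponse.BodyHandlers.ofString()).body());
    }
}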
The included integration test mostly queries the returned metadata, as querying specific data would be dependent on the DataSet that you plugged in. By running the test in an IDE debugger you can see the actual data returned, but a sample format is already given as a comment in the client's DataSetQueryResponse. Note the returned data, given as the List<List<String>> rows field in this class, shows that all data is returned as strings and must be retrieved by row and column:
// for each row of the DataSet
for (List<String> row : response.getRows()) {
    // query the row's columns, converting to different types as desired
    String col1 = row.get(0);
    int col2 = Integer.parseInt(row.get(1));
    ...
}
Note the columns are ordered as they were in the query given in the DataSetQueryRequest.
Notes
Client credentials appear to be specific to the user creating them, and are hence limited to the DataSet access permissions that user has. Should the user lose access, another set of client credentials will need to be created.
Domo does not appear to have any direct support for pagination. If you have a very large DataSet, using a WHERE clause in the required SQL statement seems to be the lone option; a rough sketch of that approach follows.
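A rough sketch of that workaround, assuming the DataSet has a numeric id column whose maximum you know or can query first ("table" is the alias Domo's query API uses for the DataSet being queried; the column names and range size are made up):

// hypothetical range-based paging using only WHERE clauses
void queryInRanges(long maxId) {
    long rangeSize = 10_000;
    for (long start = 0; start < maxId; start += rangeSize) {
        String sql = String.format(
                "SELECT id, col1, col2 FROM table WHERE id > %d AND id <= %d",
                start, start + rangeSize);
        // feed sql into a DataSetQueryRequest and process the response rows as usual
    }
}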
Posted by Glen Mazza in Programming at 07:00AM Aug 07, 2023 | Tags: domo | Comments[0]
This entry provides a simple example of using Spring Shell (within Spring Boot) and Micrometer to send custom metrics to Datadog. For a fuller example with Docker and the Datadog Agent, I recommend Datadog Learning Center's free Datadog 101: Developer course. This course will also provide you a free two-week training Datadog account which you can use to receive custom metrics for this example, useful if you don't care to test against your company account. Note custom metrics ordinarily carry billing costs, requiring a paid Datadog account.
Spring Boot already provides numerous web metrics in several areas that can be sent to Datadog without explicit need to capture them. The jvm.* properties, for example, are readable in Datadog's Metrics Explorer, filtering by statistic:value for this example in the "from" field. For custom metrics, we'll have the Spring Shell app commands modify a Timer and a Gauge.
Create the Spring Shell application. From Spring Initializr, choose a Java JAR app with Spring Shell and Datadog as dependencies. For some reason I needed to choose the Spring Boot 2.7.x series for an app to download. Prior to running the demo in your IDE (I use IntelliJ), the management.metrics.export.datadog.apiKey=... value needs to be added to the main/resources/application.properties file. Your API key can be found by logging into Datadog and, from the bottom of the left-side menu, clicking on your name, then Organization Settings, then Access, then API Keys.
Create the shell commands to add to the timer and gauge values:
package com.example.demo;

import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.Tags;
import io.micrometer.core.instrument.Timer;
import org.springframework.shell.standard.ShellComponent;
import org.springframework.shell.standard.ShellMethod;

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.TimeUnit;

@ShellComponent
public class MyCommands {

    private final Timer timer;
    private final List<Integer> integerList;

    public MyCommands(MeterRegistry registry) {
        timer = registry.timer("demoapp.timer", Tags.empty());
        integerList = registry.gauge("demoapp.listsize", Tags.empty(), new ArrayList<>(), List::size);
    }

    @ShellMethod("Note a timer event of given duration in seconds")
    public String timer(int seconds) {
        timer.record(seconds, TimeUnit.SECONDS);
        return "Timer event noted";
    }

    @ShellMethod("Add an element to list")
    public String listAdd() {
        integerList.add(10);
        return "List has " + integerList.size() + " elements";
    }

    @ShellMethod("Remove an element from list (if possible)")
    public String listRemove() {
        if (integerList.size() > 0) {
            integerList.remove(integerList.size() - 1);
        }
        return "List has " + integerList.size() + " elements";
    }
}
For the above we're keeping a gauge on the size of the list, and for the timer, we provide the number of seconds that an arbitrary event took.
Run the application and enter several timer #secs, list-add, and list-remove commands. Run -> Run Application from the IntelliJ menu should work fine for entering the commands in the IDE's Console view. To keep the connection with Datadog, keep the command-line app running, even if you're not entering commands. After 2 or 3 minutes, check Datadog's Metrics Explorer to confirm that the demoapp.timer and demoapp.listsize metrics are getting received.
A dashboard can be created to show both properties at once (with separate average time and counts given for the Timer).
Posted by Glen Mazza in Programming at 07:00AM May 20, 2023 | Tags: datadog | Comments[0]
For a Spring Boot application accessing a Flyway-managed MySQL database, I updated its integration tests from using in-memory HSQLDB to Testcontainers' MySQL module. This was both to test against a database more closely matching deployment environments and to have various tables pre-populated with standard data provided by our Flyway migration scripts. I'm happy to report that the cost of these benefits was only a slight increase in test running time (perhaps 10-15% more), and that despite there being 170 Flyway migrations at the time of the conversion.
It is best to use Testcontainers when starting to develop the application, so any hiccups found in a Flyway migration file can be fixed before that migration file becomes final. Switching to Testcontainers after the fact uncovered problems with some of our migration scripts, requiring additional configuration of the MySQL testcontainer. The main problems were unnecessary explicit specification of the schema name in the scripts, and a suboptimal definition of a table that required explicit_defaults_for_timestamp to be configured in MySQL. This configuration was in our deployment my.cnf files but not in the default one used by the MySQL testcontainer.
Solving the first issue involved explicitly specifying the username, password, and database name when creating the Testcontainers instance. Initializing the MySQL container just once for all integration tests is sufficient for our particular application, so Testcontainers' manual container lifecycle control, which uses Java configuration, was used:
@SpringBootTest
@ActiveProfiles("itest")
public class MyAppIntegrationTest {

    private static final MySQLContainer MY_SQL_CONTAINER;

    static {
        MY_SQL_CONTAINER = new MySQLContainer("mysql:5.7")
                .withUsername("myappUser")
                .withPassword("myappPass")
                .withDatabaseName("myapp");
        MY_SQL_CONTAINER.start();
    }

    @DynamicPropertySource
    public static void containersProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.username", MY_SQL_CONTAINER::getUsername);
        registry.add("spring.datasource.password", MY_SQL_CONTAINER::getPassword);
        registry.add("spring.datasource.url", MY_SQL_CONTAINER::getJdbcUrl);
    }
}
The containersProperties call dynamically populates the given Spring Boot config values with those provided by the MySQL testcontainer. Note the spring.datasource.url will be a "regular" MySQL URL, i.e., Spring Boot and Flyway are unaware that the database is a Testcontainers-provided one.
When doing Java-based instantiation of the MySQL instance as above, I've found adding the standard property-file configuration of the MySQL testcontainer to be unnecessary and best avoided -- doing so appeared to create a second, unused container instance, slowing down builds. However, if you choose to use properties-only configuration (perhaps wishing to instantiate and initialize MySQL testcontainers more frequently during the integration tests), configuration similar to the below should work. Note the "myappdatabase", "myuser" and "mypass" values given in the url property are used to tell Testcontainers the values to set when creating the database and the user. In turn, whatever values are placed here should go into the standard Spring username and password properties as shown:
spring.flyway.enabled=true
spring.datasource.url=jdbc:tc:mysql:5.7://localhost/myappdatabase?user=myuser&password=mypass
spring.datasource.driver-class-name=org.testcontainers.jdbc.ContainerDatabaseDriver
spring.datasource.username=myuser
spring.datasource.password=mypass
Fixing the second issue involved using a different my.cnf for the MySQL testcontainer. To accomplish that, I copied the mysql-default-conf/my.cnf directory and file from the org.testcontainers.mysql:1.17.6 library (easily viewable from IntelliJ IDEA), pasted it as src/itest/resources/mysql-default-conf/my.cnf in the Spring Boot application, and added the needed change to that copy.
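For illustration, the change amounts to one line added under the [mysqld] section of that copied file (the value shown assumes the setting should be enabled; everything else stays as the library provided it):

[mysqld]
# ...settings copied from the library's default my.cnf, left as-is...
explicit_defaults_for_timestamp = 1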
Notes:
The Gradle dependencies used to activate the MySQL testcontainer in the integration test environment:
configurations {
    integrationTestImplementation.extendsFrom testImplementation
    integrationTestRuntimeOnly.extendsFrom testRuntimeOnly
}

dependencies {
    integrationTestImplementation 'org.springframework.boot:spring-boot-starter-test'
    integrationTestImplementation 'org.testcontainers:mysql'
    integrationTestImplementation 'org.testcontainers:junit-jupiter'
}
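Those configurations presume an integrationTest source set; if your build doesn't define one yet, a sketch along these lines (the directory names are assumptions matching the src/itest layout mentioned earlier -- adjust to your project) would wire it up:

sourceSets {
    integrationTest {
        java.srcDir 'src/itest/java'
        resources.srcDir 'src/itest/resources'
        compileClasspath += sourceSets.main.output
        runtimeClasspath += sourceSets.main.output
    }
}

tasks.register('integrationTest', Test) {
    testClassesDirs = sourceSets.integrationTest.output.classesDirs
    classpath = sourceSets.integrationTest.runtimeClasspath
    useJUnitPlatform()
}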
Posted by Glen Mazza in Programming at 07:00AM May 14, 2023 | Comments[0]
I've added support for the Composite API to my Salesforce CRM Java client, in particular for updates and inserts. This API method allows for up to 25 subrequests at the cost of just one API call (with a few other limitations depending on the types of calls made). For insertion-only requests, where the items being inserted are not related to each other, the Multiple Record Insertion technique added earlier to the client is probably best, as it allows for up to 200 insertions in one API call.
For tl;dr purposes, reviewing the testMultipleEntityRecordInsertionsAndCompositeCalls() test case shows how a Composite API call can be made with this client. The sample makes one request consisting of one insert and two updates, and shows how to obtain the fields returned in both success and error scenarios. An example of the Multiple Record Insertion technique is in the same test.
As shown in the Salesforce docs, composite calls consist of a series of objects with four fields: method, url, referenceId, and body:
{ "compositeRequest" : [{ "method" : "POST", "url" : "/services/data/v57.0/sobjects/Account", "referenceId" : "refAccount", "body" : { "Name" : "Sample Account" } },{ "method" : "POST", "url" : "/services/data/v57.0/sobjects/Contact", "referenceId" : "refContact", "body" : { "LastName" : "Sample Contact", "AccountId" : "@{refAccount.id}" } }] }
The method will be POST for inserts and PATCH for updates, while the url field refers to the object being inserted or updated. Example URL formats for Accounts:
Inserts: /services/data/vXX.X/sobjects/Account
Updates: /services/data/vXX.X/sobjects/Account/id/(SF ref ID of Account)
To provide these fields, the client provides an abstract CompositeEntityRecord, taking care of all fields but the body, the latter to be provided by subclasses the developer creates.
public abstract class CompositeEntityRecord {

    @JsonIgnore
    private final String entity;

    private final String referenceId;
    private final Method method;

    // URL dynamically generated when making request
    private String url;

    // getters and setters
}
The entity is @JsonIgnored as it is not part of the JSON object sent to Salesforce. As the URL contains the Salesforce API version and other call-specific data, it will be dynamically generated by the SalesforceCompositeRequestService at the time of the call. A sample CompositeEntityRecord subclass, to update an Account's site and number of employees:
public class AccountUpdateCompositeRecord extends CompositeEntityRecord {

    private Body body = new Body();

    public AccountUpdateCompositeRecord(String referenceId) {
        super("Account", Method.PATCH, referenceId);
    }

    public Body getBody() {
        return body;
    }

    public static class Body {
        public int numberOfEmployees;
        public String site;

        // getters and setters
    }
}
For updates, the id should not be placed in the body; it will instead be placed in the url field at the time of the service call (below). The referenceId needs to be a unique value for all subrequests of the composite request. For updates, the SF ID of the object being updated (unless you're updating one item multiple times in the same call) would be an excellent fit.
Insertions will normally involve more fields, so it will usually be necessary to create another subclass with its additional fields. Also, there won't be an SF ID yet, so just choose a unique string for each reference ID in the composite call. In the response coming back, use the same ID to obtain the subrequest's results.
public class AccountInsertCompositeRecord extends CompositeEntityRecord {

    private final Body body = new Body();

    public AccountInsertCompositeRecord(String referenceId) {
        super("Account", Method.POST, referenceId);
    }

    public Body getBody() {
        return body;
    }

    public static class Body {
        ....
    }
}
The client provides a CompositeEntityRecordRequest to hold all the subrequests:
public class CompositeEntityRecordRequest {

    boolean allOrNone;

    List<? extends CompositeEntityRecord> compositeRequest;

    public CompositeEntityRecordRequest(boolean allOrNone) {
        this.allOrNone = allOrNone;
    }

    // getters and setters
}
See the Salesforce docs for the usage of allOrNone; due to its importance, it is placed in the constructor to require it to be specified. For the response returned by the Composite API call, the CompositeEntityRecordResponse class below is used. The format of the result body returned from Salesforce is unfortunately different in the success (Map) and failure (List of Maps) cases, so the Result.body field is declared as an Object. However, there are helper methods getSuccessResultsMap() and getErrorResultsList() in the Result object to help you parse the body (see the client test case mentioned above for an example of both). By first reading the Result's httpStatusCode you can determine the proper method to call.
package net.glenmazza.sfclient.model;

import java.util.List;
import java.util.Map;

public class CompositeEntityRecordResponse {

    List<Result> compositeResponse;

    // getters and setters

    public static class Result {
        private int httpStatusCode;
        private String referenceId;
        private Map<String, String> httpHeaders;
        private Object body;

        // getters and setters

        public Map<String, Object> getSuccessResultsMap() {
            // ...
        }

        public List<Map<String, Object>> getErrorResultsList() {
            // ...
        }
    }
}
As shown in the test case, once the CompositeEntityRecordRequest object is created, a call to the client's SalesforceCompositeRequestService is straightforward:
CompositeEntityRecordResponse cerr = scrs.bulkProcess(cerReq);
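Putting it together, an illustrative end-to-end sketch (the Body setters and the request/response getters and setters are assumed from the "// getters and setters" comments above; the field values and reference IDs are made up):

// build the subrequests: one insert, one update (reference IDs must be unique per request)
AccountInsertCompositeRecord insert = new AccountInsertCompositeRecord("refNewAccount");
// populate insert.getBody() fields as defined in your Body subclass...

AccountUpdateCompositeRecord update = new AccountUpdateCompositeRecord("0015f00000ABCDE");
update.getBody().setNumberOfEmployees(120);
update.getBody().setSite("Rockville");

CompositeEntityRecordRequest cerReq = new CompositeEntityRecordRequest(false);
cerReq.setCompositeRequest(List.of(insert, update));

CompositeEntityRecordResponse cerr = scrs.bulkProcess(cerReq);

// per-subrequest results: the status code determines which helper method applies
for (CompositeEntityRecordResponse.Result result : cerr.getCompositeResponse()) {
    if (result.getHttpStatusCode() >= 200 && result.getHttpStatusCode() < 300) {
        Map<String, Object> success = result.getSuccessResultsMap();
        System.out.println(result.getReferenceId() + " succeeded: " + success);
    } else {
        List<Map<String, Object>> errors = result.getErrorResultsList();
        System.out.println(result.getReferenceId() + " failed: " + errors);
    }
}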
Further Resources:
How to Use Salesforce Composite Rest API - Roycon website
Posted by Glen Mazza in Salesforce CRM at 07:00AM May 12, 2023 | Comments[0]
Updated May 2023.
I've made available on GitHub a Spring Security-based client library for making OAuth2-enabled REST calls to Salesforce CRM's API. The library supports use of Salesforce's:
SOQL Query: These are handled by the SOQLQueryRunner which provides two options for responses: a JSON-formatted string or a developer-defined parameterized implementation of SOQLQueryResponse (example in the integrated test). The latter takes advantage of the fact that SOQL queries share much common structure and need only a relatively small portion to be overridden to hold fields specific to the SOQL query.
Multiple Record Insertion: See Inserting multiple Salesforce CRM records with a single API call for more details.
Composite API: See Composite API added to Salesforce CRM Java Client for more details.
For authentication, supported are Salesforce's JWT Bearer Token and username/password flows, discussed in my earlier blog post. Spring Security's OAuth 2 client support is used to obtain the access tokens necessary for making these calls.
The integrated test cases give examples of the client in action. As they involve creating, updating, and deleting Salesforce Accounts, they should be run against a non-production instance. Salesforce offers free developer instances. Note the test case for the Apex REST functionality will require installing this Apex REST endpoint from the Salesforce documentation. To run the tests, first create an application-test.properties file in the itest resources folder with the configuration necessary for the flow you are using. There is a template file in that folder specifying what is needed for each OAuth2 flow type. For usage of this library by other applications, this configuration would be placed in the importing application's properties file. The library's SalesforceOAuth2Config class reads that configuration, and will halt on startup with informational messages if anything needed is missing. Once done, the integrated tests can be run from IntelliJ or the command line via ./gradlew integratedTest.
The Username/Password flow is supported out of the box by Spring, but the JWT bearer token flow requires some extra classes implemented in the client.
What happens when access tokens expire? The WebClient calls have a retry(1) setting that allows for one additional call to the resource server in case of an error such as using an expired access token. In such cases, for the first call, the failure handler in SalesforceOAuth2Config removes the authorized client instance (which has the invalid access token) from memory. For the retry call, SalesforceJwtBearerOAuth2AuthorizedClientProvider notes that there is no longer an authorized client instance and so proceeds to obtain a new access token, allowing the second call to proceed. This functionality can be verified by revoking the access token from either Salesforce Setup's Session Management screen or from Connected Apps usage, and confirming that a subsequent resource API call still provides the data. Code breakpoints can also be used to confirm another access token was requested.
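As a rough illustration of that retry wiring (a sketch in the spirit of the client's code, not a copy of it), a WebClient resource call with one retry looks something like:

// illustrative only: resource-server call that retries once, giving the failure
// handler a chance to evict the expired authorized client before the second attempt
public String getAccountJson(WebClient webClient, String url) {
    return webClient.get()
            .uri(url)
            .retrieve()
            .bodyToMono(String.class)
            .retry(1)   // one extra attempt, e.g. after an expired-token failure
            .block();
}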
Posted by Glen Mazza in Salesforce CRM at 07:00AM May 07, 2023 | Comments[0]
Basit-Mahmood Ahmed has provided a nice example of adding a custom grant to Spring Authorization Server, providing a replacement for the "resource owner" grant removed from the OAuth 2.1 standard. I was able to leverage that to provide our own resource owner implementation. At work we've needed to create several types of custom grants; thankfully, what started off as perplexing to implement became, due to repetition, rather routine. The best starting advice I can give is to examine the out-of-the-box grant types and follow along with them. The Reference Guide, of course, and YouTube videos are also valuable, for example Getting Started with Spring Authorization Server and Configuring and Extending Spring Authorization Server.
For each custom grant type to support under Spring Auth Server, I've normally found five extra source files needed, as well as adjustments to a couple of others. Most classes are limited in their responsibilities, helping keep their creation straightforward. Links to Basit-Mahmood's examples are given where applicable, as well as some additional code samples:
The custom grant Token class (example): Extending OAuth2AuthorizationGrantAuthenticationToken, this class holds the properties used by the Provider (discussed below) to authenticate the client. For the resource owner grant, it would have username and password. For a grant based on incoming IP Address, it would be an IP address string. This class is also a good place to define the custom token grant_type parameter used in the OAuth2 token request.
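As an illustration, a minimal token class for a hypothetical IP-address grant might look like the below; the grant type URN and the ipAddress field are made up, while the superclass constructor signature is Spring Authorization Server's.

import java.util.Map;

import org.springframework.security.core.Authentication;
import org.springframework.security.oauth2.core.AuthorizationGrantType;
import org.springframework.security.oauth2.server.authorization.authentication.OAuth2AuthorizationGrantAuthenticationToken;

public class IpAddressGrantAuthenticationToken extends OAuth2AuthorizationGrantAuthenticationToken {

    // custom grant_type value clients send to the token endpoint (made-up URN)
    public static final AuthorizationGrantType IP_ADDRESS_GRANT =
            new AuthorizationGrantType("urn:mycompany:params:oauth:grant-type:ip-address");

    private final String ipAddress;

    public IpAddressGrantAuthenticationToken(String ipAddress, Authentication clientPrincipal,
                                             Map<String, Object> additionalParameters) {
        super(IP_ADDRESS_GRANT, clientPrincipal, additionalParameters);
        this.ipAddress = ipAddress;
    }

    public String getIpAddress() {
        return ipAddress;
    }
}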
The custom grant Converter class (example): This class takes the incoming HttpServletRequest and, by reading its properties, creates an instance of the Token class. Parameter validation for obvious shortcomings (missing param values, etc.) is good to do here, to help keep the Provider uncluttered.
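Continuing the hypothetical IP-address grant, a minimal converter sketch (the ip_address request parameter is made up; the jakarta import assumes a Spring Boot 3-era baseline):

import jakarta.servlet.http.HttpServletRequest;

import java.util.Map;

import org.springframework.security.core.Authentication;
import org.springframework.security.core.context.SecurityContextHolder;
import org.springframework.security.oauth2.core.OAuth2AuthenticationException;
import org.springframework.security.oauth2.core.OAuth2ErrorCodes;
import org.springframework.security.web.authentication.AuthenticationConverter;

public class IpAddressGrantAuthenticationConverter implements AuthenticationConverter {

    @Override
    public Authentication convert(HttpServletRequest request) {
        // only handle our custom grant_type; returning null lets other converters try
        String grantType = request.getParameter("grant_type");
        if (!IpAddressGrantAuthenticationToken.IP_ADDRESS_GRANT.getValue().equals(grantType)) {
            return null;
        }

        // the client principal was authenticated earlier in the filter chain
        Authentication clientPrincipal = SecurityContextHolder.getContext().getAuthentication();

        // validate obvious shortcomings here to keep the Provider uncluttered
        String ipAddress = request.getParameter("ip_address");
        if (ipAddress == null || ipAddress.isBlank()) {
            throw new OAuth2AuthenticationException(OAuth2ErrorCodes.INVALID_REQUEST);
        }

        return new IpAddressGrantAuthenticationToken(ipAddress, clientPrincipal, Map.of());
    }
}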
The Provider class (example): This class takes the Token created by the Converter and authenticates and authorizes the grant. In general, there are two parts to this: authentication of the token, frequently handled by two other classes, discussed below, followed by construction of the JWT, partly in the Provider and partly in the OAuth2TokenCustomizer discussed below.
A token to represent the resource owner. This class will extend from AbstractAuthenticationToken, and will be used both to authenticate a user and to represent the user after authentication.
package ...;

import org.springframework.security.authentication.AbstractAuthenticationToken;
import org.springframework.security.core.GrantedAuthority;
...

public class MyInnerAuthenticationToken extends AbstractAuthenticationToken {

    private final MyAccessUser myAccessUser;

    // constructor for user-to-validate
    public MyInnerAuthenticationToken(String propertyToCheck) {
        super(null);
        this.myAccessUser = new MyAccessUser(propertyToCheck);
    }

    // constructor for validated user
    public MyInnerAuthenticationToken(MyAccessUser myAccessUser,
            Collection<? extends GrantedAuthority> authorities) {
        super(authorities);
        this.myAccessUser = myAccessUser;
        super.setAuthenticated(true); // must use super, as we override
    }

    @Override
    public Object getPrincipal() {
        return this.myAccessUser;
    }

    @Override
    public void setAuthenticated(boolean isAuthenticated) throws IllegalArgumentException {
        Assert.isTrue(!isAuthenticated,
                "Cannot set this token to trusted - use constructor which takes a GrantedAuthority list instead");
        super.setAuthenticated(false);
    }

    @Override
    public String getName() {
        return this.myAccessUser.getName();
    }
}
Another Provider to authenticate the above token. This would be called by the OAuth2 grant Provider during authentication. This Provider implements the standard authenticate(Authentication) method, returning a new Token populated with the principal and its authorities.
@Service
public class MyInnerAuthenticationProvider implements AuthenticationProvider {

    private static final Logger LOGGER = LoggerFactory.getLogger(MyInnerAuthenticationProvider.class);

    @Autowired
    private MyAuthenticator authenticator;

    @Override
    public Authentication authenticate(Authentication authentication) throws AuthenticationException {
        MyAccessUser unvalidatedUser = (MyAccessUser) authentication.getPrincipal();
        String ip = unvalidatedUser.getIpAddress();
        MyAccessUser validatedAccessUser = authenticator.checkIpAddress(ip);

        if (validatedAccessUser != null) {
            Collection<GrantedAuthority> authorities = authenticator.toGrantedAuthorities(
                    validatedAccessUser.getPermissions());
            return new MyInnerAuthenticationToken(validatedAccessUser, authorities);
        } else {
            LOGGER.warn("Could not validate user {}", unvalidatedUser);
            return null;
        }
    }

    @Override
    public boolean supports(Class<?> authentication) {
        return MyInnerAuthenticationToken.class.isAssignableFrom(authentication);
    }
}
Spring Authorization Server allows for creating an OAuth2TokenCustomizer implementation for adding claims to a JWT that are common to multiple grant types. It should get picked up automatically by the framework's JwtGenerator. If you've created one, this is a good stage to review any adjustments or additions that can be made to it as a result of the new custom grant.
@Component
public class MyTokenCustomizer implements OAuth2TokenCustomizer<JwtEncodingContext> {

    public void customize(JwtEncodingContext context) {
        JwtClaimsSet.Builder claimsBuilder = context.getClaims();
        claimsBuilder.claim(ENVIRONMENT_ID, environment);

        // Spring Auth Server's JwtGenerator does not provide JTI by default
        claimsBuilder.claim(JwtClaimNames.JTI, UUID.randomUUID().toString());

        Authentication token = context.getPrincipal();

        // can add principal-specific claims:
        if (token.getPrincipal() instanceof MySubjectClass chiefJwtSubject) {
            ...
        }
    }
}
Once these classes are completed, it's time to wire the new grant support into the authorization server. To wire up the grant-level Converter and Provider, within a WebSecurityConfigurerAdapter subclass:
List<AuthenticationConverter> converters = new ArrayList<>();
converters.add(resourceOwnerPasswordAuthenticationConverter);

authorizationServerConfigurer
    .tokenEndpoint(tokenEndpoint -> {
        tokenEndpoint
            .accessTokenRequestConverter(new DelegatingAuthenticationConverter(converters))
            // lots more providers
            .authenticationProvider(resourceOwnerPasswordAuthenticationProvider);
    });
The mini-level Provider, used for the actual authentication of the User, can be configured separately as a @Bean:
@Bean
public MyInnerAuthenticationProvider myInnerAuthenticationProvider() {
    return new MyInnerAuthenticationProvider();
}
Once developed, it's easy to test with Postman. Spring Auth Server uses an oauth2_registered_client table where the client_id and client_secret for clients are defined. Within Postman's Authorization tab, choose the Basic Auth type and enter the client ID and secret as the credentials.
Then the new grant type can be tested with a POST call to the standard oauth2/token endpoint using that grant_type.
Posted by Glen Mazza in Programming at 07:00AM May 05, 2023 | Comments[0]
Salesforce Marketing Cloud's Automation Studio allows for placing email interaction data into CSV files and having those files sent to an SFTP server for external access. Opens and sends were covered earlier; now I'd like to document extracting clicks and bounces into separate CSVs. As before, objects will be created using SOAP and REST calls, most easily made using the Marketing Cloud Postman collection. There will still be a little work needed in Automation Studio, namely plugging the activities into the automation "shell" created below, and configuring the automation's scheduling and email notifications.
This is a three-step automation, with the first step being SQL query activities, followed by separate steps for data extracts and file transfers. For this example, the bounces extract does more joins to get more fields than the clicks extract, but the latter can be expanded the same way by adjusting its data extension and SQL query. Note these queries reference the "common" job, subscriber, and list data extensions given in the first script of the sends-and-opens tutorial, so those will need to be created before the below scripts are used. Further, the SQL query activities that populate the common data extensions should run either in this automation, prior to the given first step, or in another automation that runs before this one, so the common data extensions will be filled and up to date.
The first file contains a single SOAP request that creates the clicks and bounces data extensions, SQL query definitions, and the automation shell:
<?xml version="1.0" encoding="UTF-8"?> <s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope" xmlns:a="http://schemas.xmlsoap.org/ws/2004/08/addressing" xmlns:u="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd"> <s:Header> <a:Action s:mustUnderstand="1">Create</a:Action> <a:To s:mustUnderstand="1">https://{{et_subdomain}}.soap.marketingcloudapis.com/Service.asmx</a:To> <fueloauth xmlns="http://exacttarget.com">{{dne_etAccessToken}}</fueloauth> </s:Header> <s:Body xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"> <CreateRequest xmlns="http://exacttarget.com/wsdl/partnerAPI"> <Objects xsi:type="Automation"> <Name>Clicks and Bounces Extracts Automation</Name> <CustomerKey>clicks_bounces_atmn_key</CustomerKey> <Description>Automation to create extracts for bounces and clicks. Run at least 30 mins after the common task as the latter's tables are needed in this task. </Description> <AutomationTasks> <AutomationTask> <Name>Query clicks and bounces data</Name> <Activities> <Activity> <ActivityObject xsi:type="QueryDefinition"> <Name>Replace with cb_s1a_clicks_yesterday_sq</Name> </ActivityObject> </Activity> <Activity> <ActivityObject xsi:type="QueryDefinition"> <Name>Replace with cb_s1b_bounces_yesterday_sq</Name> </ActivityObject> </Activity> </Activities> </AutomationTask> <AutomationTask> <Name>Create files, place in SFMC safehouse</Name> <Activities> <Activity> <ActivityObject xsi:type="DataExtractActivity"> <Name>Replace with cb_s2a_clicks_data_extract</Name> </ActivityObject> </Activity> <Activity> <ActivityObject xsi:type="DataExtractActivity"> <Name>Replace with cb_s2b_bounces_data_extract</Name> </ActivityObject> </Activity> </Activities> </AutomationTask> <AutomationTask> <Name>Move files from safehouse to SFMC SFTP server</Name> <Activities> <Activity> <ActivityObject xsi:type="FileTransferActivity"> <Name>Replace with cb_s3a_clicks_file_transfer</Name> </ActivityObject> </Activity> <Activity> <ActivityObject xsi:type="FileTransferActivity"> <Name>Replace with cb_s3b_bounces_file_transfer</Name> </ActivityObject> </Activity> </Activities> </AutomationTask> </AutomationTasks> <AutomationType>scheduled</AutomationType> </Objects> <Objects xsi:type="ns1:DataExtension" xmlns:ns1="http://exacttarget.com/wsdl/partnerAPI"> <Name>cb_s1a_clicks_yesterday_de</Name> <CustomerKey>cb_s1a_clicks_yesterday_de_key</CustomerKey> <Fields> <Field> <Name>SubscriberID</Name> <IsRequired>true</IsRequired> <FieldType>Number</FieldType> </Field> <Field> <Name>SubscriberKey</Name> <MaxLength>256</MaxLength> <IsRequired>true</IsRequired> <FieldType>Text</FieldType> </Field> <Field> <Name>EmailAddress</Name> <FieldType>EmailAddress</FieldType> </Field> <Field> <Name>EventDate</Name> <IsRequired>true</IsRequired> <FieldType>Date</FieldType> </Field> <Field> <Name>ListID</Name> <IsRequired>true</IsRequired> <FieldType>Number</FieldType> </Field> <Field> <Name>LinkContent</Name> <MaxLength>4000</MaxLength> <FieldType>Text</FieldType> </Field> <Field> <Name>URL</Name> <MaxLength>900</MaxLength> <FieldType>Text</FieldType> </Field> <Field> <Name>LinkName</Name> <MaxLength>1024</MaxLength> <FieldType>Text</FieldType> </Field> <Field> <Name>Domain</Name> <MaxLength>128</MaxLength> <IsRequired>true</IsRequired> <FieldType>Text</FieldType> </Field> <Field> <Name>TriggererSendDefinitionObjectID</Name> <MaxLength>256</MaxLength> <FieldType>Text</FieldType> </Field> <Field> <Name>JobID</Name> <IsRequired>true</IsRequired> 
<FieldType>Number</FieldType> </Field> <Field> <Name>BatchID</Name> <IsRequired>true</IsRequired> <FieldType>Number</FieldType> </Field> <Field> <Name>TriggeredSendCustomerKey</Name> <MaxLength>36</MaxLength> <FieldType>Text</FieldType> </Field> <Field> <Name>IsUnique</Name> <IsRequired>true</IsRequired> <FieldType>Boolean</FieldType> </Field> <Field> <Name>AccountID</Name> <IsRequired>true</IsRequired> <FieldType>Number</FieldType> </Field> <Field> <Name>OYBAccountID</Name> <FieldType>Number</FieldType> </Field> </Fields> </Objects> <Objects xsi:type="QueryDefinition"> <Name>cb_s1a_clicks_yesterday_sq</Name> <CustomerKey>cb_s1a_clicks_yesterday_sq_key</CustomerKey> <QueryText> select c.LinkContent, c.TriggererSendDefinitionObjectID, c.SubscriberID, c.JobID, c.URL, c.BatchID, c.LinkName, c.Domain, c.ListID, c.EventDate, c.SubscriberKey, sub.EmailAddress, c.TriggeredSendCustomerKey, c.IsUnique, c.AccountID, c.OYBAccountID from _Click c LEFT OUTER JOIN common_subscriber_table_de sub ON c.SubscriberKey = sub.SubscriberKey WHERE c.EventDate >= DATEADD(DAY, -1, CAST(GETDATE() AS DATE)) AND c.EventDate < DATEADD(DAY, 0, CAST(GETDATE() AS DATE)) </QueryText> <TargetType>DE</TargetType> <DataExtensionTarget> <Name>cb_s1a_clicks_yesterday_de</Name> <CustomerKey>cb_s1a_clicks_yesterday_de_key</CustomerKey> </DataExtensionTarget> <TargetUpdateType>Overwrite</TargetUpdateType> </Objects> <Objects xsi:type="ns1:DataExtension" xmlns:ns1="http://exacttarget.com/wsdl/partnerAPI"> <Name>cb_s1b_bounces_yesterday_de</Name> <CustomerKey>cb_s1b_bounces_yesterday_de_key</CustomerKey> <Fields> <Field> <Name>EventType</Name> <MaxLength>10</MaxLength> <FieldType>Text</FieldType> </Field> <Field> <Name>EventDate</Name> <MaxLength>50</MaxLength> <FieldType>Text</FieldType> </Field> <Field> <Name>EmailAddress</Name> <FieldType>EmailAddress</FieldType> </Field> <Field> <Name>SubscriberKey</Name> <MaxLength>320</MaxLength> <FieldType>Text</FieldType> </Field> <Field> <Name>FromName</Name> <MaxLength>250</MaxLength> <FieldType>Text</FieldType> </Field> <Field> <Name>ListName</Name> <MaxLength>254</MaxLength> <FieldType>Text</FieldType> </Field> <Field> <Name>ListID</Name> <FieldType>Number</FieldType> </Field> <Field> <Name>ListType</Name> <MaxLength>20</MaxLength> <FieldType>Text</FieldType> </Field> <Field> <Name>BounceType</Name> <MaxLength>30</MaxLength> <FieldType>Text</FieldType> </Field> <Field> <Name>BounceCategory</Name> <MaxLength>30</MaxLength> <FieldType>Text</FieldType> </Field> <Field> <Name>BounceSubcategory</Name> <MaxLength>30</MaxLength> <FieldType>Text</FieldType> </Field> <Field> <Name>SMTPCode</Name> <FieldType>Number</FieldType> </Field> <Field> <Name>Domain</Name> <MaxLength>250</MaxLength> <FieldType>Text</FieldType> </Field> <Field> <Name>AccountID</Name> <FieldType>Number</FieldType> </Field> <Field> <Name>JobID</Name> <FieldType>Number</FieldType> </Field> <Field> <Name>BatchID</Name> <FieldType>Number</FieldType> </Field> <Field> <Name>EmailID</Name> <FieldType>Number</FieldType> </Field> </Fields> </Objects> <Objects xsi:type="QueryDefinition"> <Name>cb_s1b_bounces_yesterday_sq</Name> <CustomerKey>cb_s1b_bounces_yesterday_sq_key</CustomerKey> <QueryText> SELECT FORMAT(b.EventDate, 'M/dd/yyyy hh:mm:ss tt') AS EventDate, b.BounceType, b.BounceCategory, b.BounceSubcategory, b.SMTPCode, b.Domain, sub.EmailAddress, b.SubscriberKey, b.ListID, lists.ListName, 'List' as listType, {{et_mid}} as AccountID, b.JobID, b.BatchID, 'Bounces' AS EventType, jobs.FromName, jobs.EmailID FROM _Bounce b LEFT 
OUTER JOIN common_subscriber_table_de sub ON b.SubscriberKey = sub.SubscriberKey LEFT OUTER JOIN common_job_table_de jobs ON b.JobID = jobs.JobID LEFT JOIN common_list_table_de lists ON b.ListID = lists.ListID WHERE b.EventDate >= DATEADD(DAY, -1, CAST(GETDATE() AS DATE)) AND EventDate < DATEADD(DAY, 0, CAST(GETDATE() AS DATE)) </QueryText> <TargetType>DE</TargetType> <DataExtensionTarget> <Name>cb_s1b_bounces_yesterday_de</Name> <CustomerKey>cb_s1b_bounces_yesterday_de_key</CustomerKey> </DataExtensionTarget> <TargetUpdateType>Overwrite</TargetUpdateType> </Objects> </CreateRequest> </s:Body> </s:Envelope>
The second file consists of four separate REST calls, to create the two data extract and two file transfer activities. For the latter, see here on setting the needed fileTransferLocationId environment variable.
POST calls to https://{{et_subdomain}}.rest.marketingcloudapis.com/automation/v1/dataextracts:

{
  "name": "cb_s2a_clicks_data_extract",
  "key": "cb_s2a_clicks_data_extract_key",
  "description": "Data extract for yesterday's clicks",
  "dataExtractTypeId": "bb94a04d-9632-4623-be47-daabc3f588a6",
  "fileSpec": "clicks_extract_%%Year%%%%Month%%%%Day%%.csv",
  "dataFields": [
    { "name": "ColumnDelimiter", "type": "string", "value": "," },
    { "name": "DECustomerKey", "type": "string", "value": "cb_s1a_clicks_yesterday_de_key" },
    { "name": "HasColumnHeaders", "type": "bool", "value": "True" }
  ]
}

{
  "name": "cb_s2b_bounces_data_extract",
  "key": "cb_s2b_bounces_data_extract_key",
  "dataExtractTypeId": "bb94a04d-9632-4623-be47-daabc3f588a6",
  "description": "Data extract for yesterday's bounces",
  "fileSpec": "bounces_extract_%%Year%%%%Month%%%%Day%%.csv",
  "dataFields": [
    { "name": "ColumnDelimiter", "type": "string", "value": "," },
    { "name": "DECustomerKey", "type": "string", "value": "cb_s1b_bounces_yesterday_de_key" },
    { "name": "HasColumnHeaders", "type": "bool", "value": "True" }
  ]
}

POST calls to https://{{et_subdomain}}.rest.marketingcloudapis.com/automation/v1/filetransfers
(ensure fileTransferLocationId is defined in the SFMC Postman environment variables):

{
  "name": "cb_s3a_clicks_file_transfer",
  "customerKey": "cb_s3a_clicks_file_transfer_key",
  "description": "File transfer for clicks data",
  "fileTransferLocationId": "{{fileTransferLocationId}}",
  "fileSpec": "clicks_extract_%%Year%%%%Month%%%%Day%%.csv",
  "isFileSpecLocalized" : true,
  "isUpload" : true
}

{
  "name": "cb_s3b_bounces_file_transfer",
  "customerKey": "cb_s3b_bounces_file_transfer_key",
  "description": "File transfer for bounces data",
  "fileTransferLocationId": "{{fileTransferLocationId}}",
  "fileSpec": "bounces_extract_%%Year%%%%Month%%%%Day%%.csv",
  "isFileSpecLocalized" : true,
  "isUpload" : true
}
Once the objects are created, go into Automation Studio, bring up the automation "shell", and plug in the activities as named in that automation. Save and do a "run once"; if all looks good, you're ready to schedule it regularly. Again, make sure the common data extensions referenced are also getting refreshed so the joins are accurate.
Notes:
MC's SFTP server, used in the file transfers, retains files for only a limited period of time (21 days as of this writing), so download what you will need to keep for a longer period. See the Marketing Cloud Documentation for more information.
For greater depth, Shibu Abraham offers several videos on various Automation Studio activities.
Posted by Glen Mazza in Marketing Cloud at 07:00AM Apr 29, 2023 | Comments[0]
Automations in Marketing Cloud consist of one or more activities placed in one or more steps, with all activities of a step activated simultaneously and no proceeding to the next step until they have all completed. They also include scheduling and notification information. Activities are defined independently and can be plugged into one or more automations; changing an activity's definition affects all automations it is included in.
For activities, I've earlier provided examples of making SOAP requests to create both SQL query activities and the data extensions they fill, and REST calls to create data extract and file transfer activities. Finally, another entry shows creating an Import activity from a file on SFTP back to a data extension in SFMC. Both articles have since been updated to include an Automation "shell" as described below.
Even without an API to create the automation holding and ordering these activities, the bulk of the work is done once the activities are created. One still has to figure out the steps and the activities that go into each one, however, requiring documentation of same. Unfortunately, when creating an automation via SOAP request, SFMC does not allow specifying the needed activities via customer key, something known at the time of creating the automation and something that remains the same across business units, which would allow for reusable SOAP requests. (I've requested them to do so.) Instead, one must specify the (internal) object ID of the activity, which is known only after the objects are created and is different for each business unit.
However, SFMC thankfully meets us halfway. With a create automation SOAP request, one can still define the steps and types of activities needed in each step, and temporarily provide, as the activity name, instructions to the human for the actual activity to manually plug into the automation with Automation Studio. This saves manual documentation of same. The automation's name, customer key, and description can be provided and kept equivalent for all BUs the automation is deployed in.
A sample SOAP request creating an automation "shell" (from one of the two updated blog articles mentioned previously):
<s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope"
            xmlns:a="http://schemas.xmlsoap.org/ws/2004/08/addressing"
            xmlns:u="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">
  <s:Header>
    <a:Action s:mustUnderstand="1">Create</a:Action>
    <a:To s:mustUnderstand="1">https://{{et_subdomain}}.soap.marketingcloudapis.com/Service.asmx</a:To>
    <fueloauth xmlns="http://exacttarget.com">{{dne_etAccessToken}}</fueloauth>
  </s:Header>
  <s:Body xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
    <CreateRequest xmlns="http://exacttarget.com/wsdl/partnerAPI">
      <Objects xsi:type="Automation">
        <Name>User Agent Info Automation</Name>
        <CustomerKey>user_agent_atmn_key</CustomerKey>
        <Description>Obtain user agent info for opens</Description>
        <AutomationTasks>
          <AutomationTask>
            <Name>Create extract for usage agent info</Name>
            <Activities>
              <Activity>
                <ActivityObject xsi:type="DataExtractActivity">
                  <Name>Replace with s1_user_agent_info_extract</Name>
                </ActivityObject>
              </Activity>
            </Activities>
          </AutomationTask>
          <AutomationTask>
            <Name>File transfer of user agent info to SFTP server</Name>
            <Activities>
              <Activity>
                <ActivityObject xsi:type="FileTransferActivity">
                  <Name>Replace with s2_user_agent_info_transfer</Name>
                </ActivityObject>
              </Activity>
            </Activities>
          </AutomationTask>
          <AutomationTask>
            <Name>Uncompress into opens.csv on SFTP server</Name>
            <Activities>
              <Activity>
                <ActivityObject xsi:type="FileTransferActivity">
                  <Name>Replace with s3_user_agent_info_unzip</Name>
                </ActivityObject>
              </Activity>
            </Activities>
          </AutomationTask>
          <AutomationTask>
            <Name>Import user agent info back into data extension</Name>
            <Activities>
              <Activity>
                <ActivityObject xsi:type="ImportDefinition">
                  <Name>Replace with s4_user_agent_info_import</Name>
                </ActivityObject>
              </Activity>
            </Activities>
          </AutomationTask>
        </AutomationTasks>
        <AutomationType>scheduled</AutomationType>
      </Objects>
    </CreateRequest>
  </s:Body>
</s:Envelope>
Once the shell and the activities have been created via API, it's a rather simple matter to hit "Replace" on each of the steps making up the automation and choose the activity named on the step, and then test and schedule the automation as usual.
The REST endpoints for automations, I believe, fall into the "undocumented" (i.e., unsupported) category. In attempting to create an automation via REST, I saw the same shortcomings as this 2019 Salesforce Stack Exchange posting. However, as shown in that posting, retrieving an automation via REST provides some value, showing the scheduling information as well as row counts in the data extensions that the automation's SQL activities use.
Notes
To obtain the ObjectID given an automation's customer key, one needs to request the ProgramID property; requesting the usual ObjectID does not work properly:
<RetrieveRequestMsg xmlns="http://exacttarget.com/wsdl/partnerAPI">
  <RetrieveRequest>
    <ObjectType>Automation</ObjectType>
    <Properties>ProgramID</Properties>
    <Properties>Name</Properties>
    <Properties>Description</Properties>
    <Properties>CustomerKey</Properties>
    <Properties>IsActive</Properties>
    <Filter xsi:type="SimpleFilterPart">
      <Property>CustomerKey</Property>
      <SimpleOperator>equals</SimpleOperator>
      <Value>user_agent_atmn_key</Value>
    </Filter>
  </RetrieveRequest>
</RetrieveRequestMsg>
To run it, the above snippet can be placed within the body of a SOAP envelope like the one given for the automation above, changing the Action header from Create to Retrieve.
Once you obtain the Object ID for an automation, you can further drill down to obtain information on the automation's steps and the activities within those steps, by querying the Task and Activity objects respectively, with this filter:
<Filter xsi:type="SimpleFilterPart">
  <Property>Program.ObjectID</Property>
  <SimpleOperator>equals</SimpleOperator>
  <Value>...uuid of automation...</Value>
</Filter>
To find properties you can request for an object, including the Automation, Task, and Activity objects, make a DefinitionRequestMsg call and, in the response, check for properties with an isRetrievable value of true.
Posted by Glen Mazza in Marketing Cloud at 07:00AM Apr 26, 2023 | Comments[0]