Glen Mazza's Weblog

https://glenmazza.net/blog/date/20180218 Sunday February 18, 2018

Sending Custom Metrics from Spring Boot to Datadog

This tutorial shows how Datadog's API can be used to send custom metrics from a Spring Boot web application and how the results can be viewed graphically in Datadog dashboards. Samantha Drago's blog post provides background on Datadog custom metrics, which require a paid Datadog account. Note that as an alternative not covered here, custom metrics can be defined via JMX and collected with Datadog's JMX Integration; that integration in particular provides a list of standard metrics that can be used even with the free DD account.

To facilitate metric accumulation and transferring of metrics to Datadog, Spring Boot's ExportMetricReader and ExportMetricWriter implementations will be used. Every 5 seconds by default (adjustable via the spring.metrics.export.delay-millis property), all MetricReader implementations marked @ExportMetricReader will have their values read and written to @ExportMetricWriter-registered MetricWriters. The class ("exporter") that handles this within Spring Boot is the MetricCopyExporter, which treats metrics whose names start with "counter." as counters (a metric that reports deltas on a continually growing statistic, like web hits) and anything else as a gauge (a standalone snapshot value at a certain point in time, such as JVM heap usage). Note, however, that Datadog apparently does not support "counter"-type metric collection via its API (everything is treated as a gauge); I'll show at the end how a summation function can be used within Datadog to work around that.
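The prefix-based routing can be sketched as follows. MetricNameRouter is a hypothetical illustration of the naming rule, not an actual Spring Boot class:

```java
// Illustrative sketch only (not the actual MetricCopyExporter source): Spring Boot 1.x's
// MetricCopyExporter routes a metric by its name prefix, treating "counter."-prefixed
// metrics as counters (written as deltas via increment()) and everything else as gauges
// (written as snapshots via set()).
public class MetricNameRouter {

    // Returns the metric type MetricCopyExporter would effectively assign to a name.
    public static String metricType(String metricName) {
        return metricName.startsWith("counter.") ? "counter" : "gauge";
    }

    public static void main(String[] args) {
        System.out.println(metricType("counter.foo")); // counter
        System.out.println(metricType("gauge.bar"));   // gauge
        System.out.println(metricType("mem.free"));    // gauge: no "counter." prefix
    }
}
```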

Spring Boot already provides several web metrics that can be sent to Datadog without any explicit need to capture them, in particular the metrics listed here that start with "counter." or "gauge.". These provide commonly requested statistics such as the number of calls to a website and average response time in milliseconds. The example below will report those statistics to Datadog along with application-specific "counter.foo" and "gauge.bar" metrics maintained by our application.
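For illustration, a GET of the Spring Boot 1.x /metrics actuator endpoint returns JSON along these lines (the metric names below are standard actuator names; the values are made up):

```json
{
  "counter.status.200.root": 20,
  "gauge.response.root": 2.0,
  "mem": 466089,
  "heap.used": 112038,
  "uptime": 494836
}
```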

  1. Create the web application. For our sample, Steps #1 and #2 of the Spring Boot to Kubernetes tutorial can be followed for this. Ensure you can see "Hello World!" at localhost:8080 before proceeding.

  2. Modify the Spring Boot application to send metrics to Datadog. Note for tutorial brevity I'm condensing the number of classes that might otherwise be used to send metrics to DD. Additions/updates to make:

    • In the project build.gradle, the gson JSON library and Apache HTTP Client libraries need to be added to support the API calls to DD:

      build.gradle:
      dependencies {
      	compile('com.google.code.gson:gson:2.8.2')
      	compile('org.apache.httpcomponents:httpclient:4.5.3')
      	...other libraries...
      }
      
    • The DemoMetricReaderWriter.java needs to be included; it serves as both the reader of our application-specific metrics (not those maintained by Spring Boot, which are handled by the BufferMetricReader included within the framework) and the writer of all metrics (app-specific and Spring Boot) to Datadog. Please see the comments within the code for implementation details.

      DemoMetricReaderWriter.java:
      package com.gmazza.demo;
      
      import com.google.gson.Gson;
      import com.google.gson.GsonBuilder;
      import com.google.gson.JsonPrimitive;
      import com.google.gson.JsonSerializer;
      import org.apache.http.HttpEntity;
      import org.apache.http.StatusLine;
      import org.apache.http.client.methods.CloseableHttpResponse;
      import org.apache.http.client.methods.HttpPost;
      import org.apache.http.entity.ByteArrayEntity;
      import org.apache.http.impl.client.CloseableHttpClient;
      import org.apache.http.impl.client.HttpClients;
      import org.apache.http.util.EntityUtils;
      import org.slf4j.Logger;
      import org.slf4j.LoggerFactory;
      import org.springframework.beans.factory.annotation.Value;
      import org.springframework.boot.actuate.metrics.Metric;
      import org.springframework.boot.actuate.metrics.reader.MetricReader;
      import org.springframework.boot.actuate.metrics.writer.Delta;
      import org.springframework.boot.actuate.metrics.writer.MetricWriter;
      import org.springframework.stereotype.Component;
      
      import javax.annotation.PostConstruct;
      import java.io.Closeable;
      import java.io.IOException;
      import java.math.BigDecimal;
      import java.util.ArrayList;
      import java.util.Arrays;
      import java.util.Date;
      import java.util.HashMap;
      import java.util.List;
      import java.util.Map;
      
      @Component
      public class DemoMetricReaderWriter implements MetricReader, MetricWriter, Closeable {
      
          private static final Logger logger = LoggerFactory.getLogger(DemoMetricReaderWriter.class);
      
          private Metric<Integer> accessCounter = null;
      
          private Map<String, Metric<?>> metricMap = new HashMap<>();
      
          private static final String DATADOG_SERIES_API_URL = "https://app.datadoghq.com/api/v1/series";
      
          @Value("${datadog.api.key}")
          private String apiKey = null;
      
          private CloseableHttpClient httpClient;
      
          private Gson gson;
      
          @PostConstruct
          public void init() {
              httpClient = HttpClients.createDefault();
      
              // removes use of scientific notation, see https://stackoverflow.com/a/18892735
              GsonBuilder gsonBuilder = new GsonBuilder();
              gsonBuilder.registerTypeAdapter(Double.class, (JsonSerializer<Double>) (src, typeOfSrc, context) -> {
                  BigDecimal value = BigDecimal.valueOf(src);
                  return new JsonPrimitive(value);
              });
      
              this.gson = gsonBuilder.create();
          }
      
          @Override
          public void close() throws IOException {
              httpClient.close();
          }
      
          // besides the app-specific metrics defined in the method below, Spring Boot also exports metrics
          // via its BufferMetricReader, namely those with a "counter." or "gauge." prefix, listed here:
          // https://docs.spring.io/spring-boot/docs/current/reference/html/production-ready-metrics.html
          public void updateMetrics(long barGauge) {
              // Using same timestamp for both metrics, makes it easier to match/compare if desired in Datadog
              Date timestamp = new Date();
      
              logger.info("Updating foo-count and bar-gauge of {} for web call", barGauge);
      
              // Updates to values involve creating new Metrics as they are immutable
      
              // Because this Metric starts with a "counter.", MetricCopyExporter used by Spring Boot will treat this
              // as a counter and not a gauge when reading/writing values.
              accessCounter = new Metric<>("counter.foo",
                      accessCounter == null ? 0 : accessCounter.getValue() + 1, timestamp);
              metricMap.put("counter.foo", accessCounter);
      
              // Does not start with "counter.", therefore a gauge to MetricCopyExporter.
              metricMap.put("gauge.bar", new Metric<>("gauge.bar", barGauge, timestamp));
          }
      
          // required by MetricReader
          @Override
          public Metric<?> findOne(String metricName) {
              logger.info("Calling findOne with name of {}", metricName);
              return metricMap.get(metricName);
          }
      
          // required by MetricReader
          @Override
          public Iterable<Metric<?>> findAll() {
              logger.info("Calling findAll(), size of {}", metricMap.size());
              return metricMap.values();
          }
      
          // required by MetricReader
          @Override
          public long count() {
              logger.info("Requesting metricMap size: {}", metricMap.size());
              return metricMap.size();
          }
      
          // required by CounterWriter (in MetricWriter), used only for counters
          @Override
          public void increment(Delta<?> delta) {
              logger.info("Counter being written: {}: {} at {}", delta.getName(), delta.getValue(), delta.getTimestamp());
              if (apiKey != null) {
                  sendMetricToDatadog(delta, "counter");
              }
          }
      
          // required by CounterWriter (in MetricWriter), but implementation optional (MetricCopyExporter doesn't call)
          @Override
          public void reset(String metricName) {
              // not implemented
          }
      
          // required by GaugeWriter (in MetricWriter), used only for gauges
          @Override
          public void set(Metric<?> value) {
              logger.info("Gauge being written: {}: {} at {}", value.getName(), value.getValue(), value.getTimestamp());
              if (apiKey != null) {
                  sendMetricToDatadog(value, "gauge");
              }
          }
      
          // API to send metrics to DD is defined here:
          // https://docs.datadoghq.com/api/?lang=python#post-time-series-points
          private void sendMetricToDatadog(Metric<?> metric, String metricType) {
              // let's add an app prefix to our values to distinguish from other apps in DD
              String dataDogMetricName = "app.glendemo." + metric.getName();
      
              logger.info("Datadog call for metric: {} value: {}", dataDogMetricName, metric.getValue());
      
              Map<String, Object> data = new HashMap<>();
      
              List<List<Object>> points = new ArrayList<>();
              List<Object> singleMetric = new ArrayList<>();
              singleMetric.add(metric.getTimestamp().getTime() / 1000);
              singleMetric.add(metric.getValue().longValue());
              points.add(singleMetric);
              // additional metrics could be added to points list providing params below are same for them
      
              data.put("metric", dataDogMetricName);
              data.put("type", metricType);
              data.put("points", points);
              // InetAddress.getLocalHost().getHostName() may be accurate for your "host" value.
              data.put("host", "localhost:8080");
      
              // optional, just adding to test
              data.put("tags", Arrays.asList("demotag1", "demotag2"));
      
              List<Map<String, Object>> series = new ArrayList<>();
              series.add(data);
      
              Map<String, Object> data2 = new HashMap<>();
              data2.put("series", series);
      
              try {
                  String urlStr = DATADOG_SERIES_API_URL + "?api_key=" + apiKey;
                  String json = gson.toJson(data2);
                  byte[] jsonBytes = json.getBytes("UTF-8");
      
                  HttpPost httpPost = new HttpPost(urlStr);
                  httpPost.addHeader("Content-type", "application/json");
                  httpPost.setEntity(new ByteArrayEntity(jsonBytes));
      
                  try (CloseableHttpResponse response = httpClient.execute(httpPost)) {
                      StatusLine sl = response.getStatusLine();
                      if (sl != null) {
                          // DD sends 202 (accepted) if it's happy
                          if (sl.getStatusCode() == 202) {
                              HttpEntity responseEntity = response.getEntity();
                              EntityUtils.consume(responseEntity);
                          } else {
                              logger.warn("Problem posting to Datadog: {} {}", sl.getStatusCode(), sl.getReasonPhrase());
                          }
                      } else {
                          logger.warn("Problem posting to Datadog: response status line null");
                      }
                  }
      
              } catch (Exception e) {
                  logger.error(e.getMessage(), e);
              }
          }
      }
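For reference, the request body that sendMetricToDatadog builds for a single gauge point looks like the following (timestamp and value are illustrative):

```json
{
  "series": [
    {
      "metric": "app.glendemo.gauge.bar",
      "type": "gauge",
      "points": [[1518958800, 1042]],
      "host": "localhost:8080",
      "tags": ["demotag1", "demotag2"]
    }
  ]
}
```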
      
    • The DemoApplication.java file needs updating to wire in the DemoMetricReaderWriter. Its "Hello World!" endpoint is also updated to send a duration gauge value (similar to, but smaller than, the more complete gauge.response.root Spring Boot metric) to the DemoMetricReaderWriter.

      DemoApplication.java:
      package com.gmazza.demo;
      
      import org.springframework.boot.SpringApplication;
      import org.springframework.boot.actuate.autoconfigure.ExportMetricReader;
      import org.springframework.boot.actuate.autoconfigure.ExportMetricWriter;
      import org.springframework.boot.autoconfigure.SpringBootApplication;
      import org.springframework.context.annotation.Bean;
      import org.springframework.web.bind.annotation.RequestMapping;
      import org.springframework.web.bind.annotation.RestController;
      
      @SpringBootApplication
      @RestController
      public class DemoApplication {
      
          public static void main(String[] args) {
              SpringApplication.run(DemoApplication.class, args);
          }
      
          private DemoMetricReaderWriter demoMetricReaderWriter = new DemoMetricReaderWriter();
      
          @Bean
          @ExportMetricReader
          @ExportMetricWriter
          DemoMetricReaderWriter getReader() {
              return demoMetricReaderWriter;
          }
      
          @RequestMapping("/")
          String home() throws Exception {
              long start = System.currentTimeMillis();
      
              // insert up to 2 second delay for a wider range of response times
              Thread.sleep((long) (Math.random() * 2000));
      
              // let that delay become the gauge.bar metric value
              long barValue = System.currentTimeMillis() - start;
      
              demoMetricReaderWriter.updateMetrics(barValue);
              return "Hello World!";
          }
      }
      
    • The application.properties in your resources folder is where you provide your Datadog API key as well as some other settings. A few other spring.metrics.export.* settings are also available.

      application.properties:
      # Just logging will occur if api.key not defined
      datadog.api.key=your_api_key_here
      # Datadog can keep per-second metrics, but using every 15 seconds per Datadog's preference
      spring.metrics.export.delay-millis=15000
      # disabling security for this tutorial (don't do in prod), allows seeing all metrics at http://localhost:8080/metrics
      management.security.enabled=false
      
  3. Make several web calls to http://localhost:8080 from a browser to send metrics to Datadog. You may also wish to access the metrics at .../metrics a few times; you'll note the app-specific metrics counter.foo and gauge.bar become listed in the web page that is returned, and that accessing /metrics itself sends additional *.metrics stats (counter.status.200.metrics and gauge.response.metrics) to Datadog. We configured the application in application.properties to send metrics to Datadog every 15 seconds; if running in your IDE, you can check the application logging in the Console window to see the metrics being sent.

  4. Log into Datadog and view the metrics sent. Two main options from the left-side Datadog menu: Metrics -> Explorer and Dashboards -> New Dashboard. For the former, one can search on the metric names in the Graph: field (see upper illustration below), with charts of the data appearing immediately to the right. For the latter (lower illustration), I selected "New Timeboard" and added three Timeseries and one Query Value for the two main Spring Boot and two application-specific metrics sent.


    Metrics Explorer

    Datadog TimeBoard

    Again, as the "counter" type is presently not supported via the Datadog API, the cumulative sum function can be used in dashboards to have the counter metrics grow over time in charts:

    Cumulative Sum function
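    In the Datadog graph editor, this amounts to applying the cumulative sum function to the counter metric. A query along these lines should work (metric name follows the "app.glendemo." prefix used above; the exact editor syntax may vary by Datadog version):

```
cumsum(avg:app.glendemo.counter.foo{*})
```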

Posted by Glen Mazza in Programming at 07:00AM Feb 18, 2018 | Comments[0]
