Understanding the Prometheus Alertmanager Configuration File

This topic provides an overview of the key components of the Prometheus Alertmanager configuration file, along with field descriptions and configuration examples.

The alertmanager.yaml file stores Alertmanager configurations. The configurations in this file specify how Alertmanager routes and delivers the alerts received from Prometheus. For example, you can configure the alertmanager.yaml file to send notifications over email (SMTP) or Slack.

To learn about accessing and configuring the alertmanager.yaml file to send notifications, see Configuring Alert Forwarding and Configuring Templates and Filtering Alerts.

The following code block shows a sample alertmanager.yaml file:
global:
  # SMTP configuration for email notifications (if needed)
  smtp_smarthost: 'mailserver.example.com:587'
  smtp_from: 'alertmanager@example.com'
  smtp_auth_username: 'alertmanager'
  smtp_auth_password: 'your_password'

  # Other global settings like resolve_timeout, http_config, etc.

route: 
  # Default receiver for alerts
  receiver: 'default-receiver'

  # Labels used for grouping alerts 
  group_by: ['alertname', 'instance', 'severity'] 

  # Timing settings (group_wait, group_interval, repeat_interval) 
  # You can have nested 'routes' for more complex routing logic 

receivers:
- name: 'default-receiver'
  email_configs:
  - to: 'ops-team@example.com'
    # ... other email settings ...

- name: 'slack-notifications'
  slack_configs:
  - api_url: 'https://hooks.slack.com/services/YOUR/SLACK/WEBHOOK'
    channel: '#alerts'
    # ... other Slack settings ...

- name: 'pagerduty-notifications'
  pagerduty_configs:
  - service_key: 'your_pagerduty_service_key'
    # ... other PagerDuty settings ...

# ... more receivers as needed (webhooks, OpsGenie, etc.)

inhibit_rules:
  # Rules to suppress alerts based on other alerts
  # Example:
  - source_match:
      severity: 'critical'
    target_match:
      severity: 'warning'
    # Suppress 'warning' alerts if a 'critical' alert is also firing

templates: 
  # Paths to template files for customizing notifications 
  - '/etc/alertmanager/templates/*.tmpl' 

Field Descriptions of the alertmanager.yaml File

The following list describes the fields of the alertmanager.yaml file:
global:

Defines general settings for Alertmanager.

resolve_timeout:

Specifies the time after which an alert is declared resolved if it has not been updated. For example, resolve_timeout: 5m.

smtp_smarthost:

Address of the SMTP server. For example, smtp_smarthost: 'mail.example.com:25'.

smtp_from:

Specifies the email address of the sender.

smtp_auth_username:

Username for SMTP authentication.

smtp_auth_password:

Password for SMTP authentication.

http_config:

(Optional) Specifies the default HTTP client settings, such as TLS, authentication, and proxies, that Alertmanager uses when sending notifications over HTTP.

receivers:

Defines the notification destinations for alerts. You can configure multiple receivers for different notification methods such as email, Slack, and others.

name:

Specifies a unique, descriptive name for the receiver. Routes reference this name to deliver alerts to the notification channel.

email_configs:

Specifies configurations for email notifications.

to:

Specifies the comma-separated list of email addresses to receive alerts.

from:

Specifies the email address from which notifications are sent.

smarthost:

Specifies the hostname and port of your SMTP server. For example, smtp.example.com:587

auth_username:

Username for SMTP authentication.

auth_password:

Password for SMTP authentication.

html:

(Optional) Format email body as HTML.

require_tls:

(Optional) Enforce TLS for sending email.

slack_configs:

Specifies configurations for Slack notifications.

channel:

Specifies the Slack channel to receive alerts.

api_url:

The webhook URL generated from your Slack Incoming Webhook integration.

title:

Specifies the title of the Slack notifications.

text:

Specifies the descriptive text for the Slack notifications.

webhook_configs:

Specifies configurations for a generic webhook receiver that sends alerts to an arbitrary HTTP endpoint.

url:

Specifies the endpoint URL to send notifications.

http_config:

(Optional) Specifies configurations for HTTP authentication, proxies, and others.

pagerduty_configs:

Specifies configurations for sending PagerDuty notifications.

service_key:

Specifies the PagerDuty integration key.

route:

Defines the routing tree that determines which receiver handles each alert, based on alert characteristics (for example, severity labels). The top-level route is required; nested routes are optional.

group_by:

List of labels used to group similar alerts together. For example, ['alertname', 'cluster', 'service'].

group_wait:

Specifies the time to wait before sending the initial notification. For example, 30s.

group_interval:

Specifies the time to wait before sending notifications about new alerts added to an existing group. For example, 5m.

repeat_interval:

Specifies the time between repeat notifications for unresolved alerts. For example, 3h.

routes:

(Optional) Defines child routes with individual routing rules for different alert types.

match:

Filters alerts by matching a specific label to a value. For example, filter by severity: critical.

match_re:

Filters alerts by applying a regular expression to match a label for more advanced matching.

receiver:

Specifies the name of the previously defined receiver to determine the destination for the alert notifications.
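For example, the match_re field can route alerts whose labels match a regular expression to a dedicated receiver, such as a webhook. The following is a minimal sketch; the service names, receiver name, and webhook URL are illustrative:

```yaml
route:
  receiver: 'default-receiver'
  routes:
    # Route alerts whose service label matches the regular expression
    - match_re:
        service: 'mysql|postgres'
      receiver: 'database-webhook'

receivers:
- name: 'database-webhook'
  webhook_configs:
  # Alertmanager POSTs a JSON payload with the grouped alerts to this URL
  - url: 'https://example.com/alert-hook'
```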

Configuring Basic Email Alerts

The following example shows a basic email alerting configuration.
global:
  smtp_smarthost: 'smtp.example.com:587'
  smtp_from: 'alertmanager@example.com'
  smtp_auth_username: 'alertmanager'
  smtp_auth_password: 'your_password'

route:
  receiver: 'email-alerts'

receivers:
- name: 'email-alerts'
  email_configs:
  - to: 'team@example.com'

This configuration sends all alerts to the specified email address. To customize:

  • Modify the SMTP settings with your email server details, such as hostname, port, and credentials.
  • Set to: under email_configs: to the email address where you want to receive alerts.
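The optional email fields described earlier can also be set per receiver. The following sketch assumes your SMTP server supports TLS; the template name shown is Alertmanager's built-in default HTML email template:

```yaml
receivers:
- name: 'email-alerts'
  email_configs:
  - to: 'team@example.com'
    require_tls: true  # enforce TLS when connecting to the SMTP server
    html: '{{ template "email.default.html" . }}'  # render the body as HTML
```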

Configuring Slack Notifications

The following example shows a Slack notification configuration.
route:
  receiver: 'slack-alerts'

receivers:
- name: 'slack-alerts'
  slack_configs:
  - api_url: 'https://hooks.slack.com/services/YOUR/SLACK/WEBHOOK'
    channel: '#alerts'
    text: "Firing: {{ .CommonAnnotations.summary }}"

This configuration sends all alerts to the specified Slack channel. To customize:

  • Set api_url: with your Slack webhook URL.
  • Set channel: with your target channel.
  • Modify the text: property with the alert message using Go templating.
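The title: field accepts Go templates as well. The following sketch builds the title from the alert status and name; toUpper is a built-in Alertmanager template function, and the labels shown are assumptions about your alerts:

```yaml
receivers:
- name: 'slack-alerts'
  slack_configs:
  - api_url: 'https://hooks.slack.com/services/YOUR/SLACK/WEBHOOK'
    channel: '#alerts'
    # Example title: [FIRING] InstanceDown
    title: '[{{ .Status | toUpper }}] {{ .CommonLabels.alertname }}'
    text: "Firing: {{ .CommonAnnotations.summary }}"
```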

Configuring Multiple Receivers with Routing

The following example shows a configuration with multiple receivers and routing.
route: 
  group_by: ['alertname', 'severity'] 
  receiver: 'default-receivers' 
  routes:
    - match:
        severity: critical
      receiver: 'pagerduty-notifications' 

receivers:
- name: 'default-receivers'
  email_configs:
  - to: 'team@example.com'
- name: 'pagerduty-notifications'
  pagerduty_configs:
  - service_key: 'your_pagerduty_service_key'

This configuration routes critical alerts to PagerDuty and all other alerts to the email address. To customize:

  • Set service_key: with the actual integration key.
  • Modify email_configs: as required.
  • Configure the routing rules based on your alert labels.
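The grouping and timing fields described earlier (group_wait, group_interval, repeat_interval) can be combined with this routing setup. The following sketch uses commonly seen values, which are illustrative rather than recommendations:

```yaml
route:
  receiver: 'default-receivers'
  group_by: ['alertname', 'severity']
  group_wait: 30s       # wait before sending the first notification for a new group
  group_interval: 5m    # wait before notifying about new alerts added to a group
  repeat_interval: 3h   # wait before re-sending notifications for unresolved alerts
```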

Configuring Alert Silencing

The following example sends alerts to both email and Slack, and suppresses alerts during scheduled maintenance windows.
route:
  receiver: 'team-alerts'

receivers:
- name: 'team-alerts'
  email_configs:
  - to: 'team@example.com'
  slack_configs:
  - api_url: 'https://hooks.slack.com/services/YOUR/SLACK/WEBHOOK'
    channel: '#alerts'

# Suppress alerts during scheduled maintenance windows
inhibit_rules:
  - source_match:
      severity: 'warning'
    target_match:
      job: 'kubeflow'
    equal: ['maintenance'] 

This configuration suppresses alerts for the kubeflow job while a warning-severity alert with the same maintenance label value is firing. This is useful for avoiding unwanted noise during a planned maintenance period. To customize:

  • Adjust the source_match: and target_match: sections to target specific alerts for silencing.
  • Use different labels and values to match your alerting setup.

Configuring Inhibition Rules

The following example shows a configuration for suppressing related alerts.
route:
  receiver: 'team-alerts'
  # ... receiver definitions for email, Slack, etc. ...

inhibit_rules:
  # Suppress node down alerts if the cluster is down
  - source_match:
      job: 'cluster'
      severity: 'critical'
    target_match:
      job: 'node'
      severity: 'critical'

This configuration prevents node down alerts from being sent when a broader cluster down alert is already firing. Customize inhibition rules carefully to avoid missing important alerts. To customize:

  • Modify source_match: to specify the alerts that suppress others, and target_match: to specify the alerts that are suppressed.
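The equal field restricts an inhibition rule so that it applies only when the source and target alerts have identical values for the listed labels. The following sketch assumes your alerts carry a cluster label:

```yaml
inhibit_rules:
  - source_match:
      job: 'cluster'
      severity: 'critical'
    target_match:
      job: 'node'
      severity: 'critical'
    # Only suppress node alerts that belong to the same cluster
    # as the firing cluster alert
    equal: ['cluster']
```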

Configuring Routing Rules

The following example shows two routing rules: one that always sends critical alerts, and one that mutes non-critical alerts on weekends.
route:
  group_by: ['alertname']  # Example: Group alerts with the same name
  receiver: 'default-receiver'

  routes:
    # Always send critical alerts regardless of time
    - match:
        severity: critical
      receiver: 'critical-alerts'  # Send critical alerts to this receiver

    # All remaining (non-critical) alerts are muted on weekends
    - receiver: 'weekend-silence'
      mute_time_intervals:
        - weekends  # Defined in the time_intervals section below

# Named time intervals referenced by mute_time_intervals
time_intervals:
  - name: weekends
    time_intervals:
      - weekdays: ['friday']   # Friday from 12 PM onwards
        times:
          - start_time: '12:00'
            end_time: '24:00'
      - weekdays: ['saturday'] # All day Saturday
      - weekdays: ['sunday']   # Sunday until 11 AM
        times:
          - start_time: '00:00'
            end_time: '11:00'

In this example, the default-receiver, critical-alerts, and weekend-silence receivers must be defined in the receivers section of your alertmanager.yaml file with details specifying how they send notifications. You can include additional notification channels in the weekend-silence receiver, such as SMS, which might be necessary during weekends.

This example defines the weekend schedule with named weekdays and time ranges, which you can modify as needed.

Always send critical alerts

This rule matches alerts with severity: critical and sends them to the critical-alerts receiver. This ensures that critical issues receive immediate attention regardless of the day or time.

Silence alerts on weekends

Alerts that do not match the first route, that is, all non-critical alerts, fall through to the second route and are sent to the weekend-silence receiver. The mute_time_intervals field on this route references the named weekends time interval, which silences notifications during the following times:
  • Friday evenings from 12 PM onwards (start_time: '12:00').
  • All day Saturday.
  • Sunday mornings until 11 AM (end_time: '11:00').