Passing JSON Variables in Azure Pipelines

Learn how to handle JSON variables in Azure DevOps pipelines, avoid escaping issues, and ensure seamless API integration with proper normalization techniques.

By Mohammed Basil · Jan. 29, 25 · Tutorial

When working with Azure DevOps pipelines, there are situations where you need to use JSON as a variable — whether it's for dynamically passing configurations, triggering APIs, or embedding JSON data into scripts. A common use case is creating a pipeline that triggers an API requiring a JSON payload.

However, Azure DevOps treats all variables as plain strings, and when you attempt to pass JSON, it often results in malformed data due to improper escaping. This can break APIs or other components expecting valid JSON.

Why JSON Variables Fail in Azure Pipelines

  • Azure DevOps pipeline variables are treated as plain strings.
  • When attempting to pass JSON, double quotes (") and backslashes (\) in the JSON are often misinterpreted, leading to malformed data.

Example of the Issue

A pipeline variable is defined as:

JSON
 
{
  "source": {
    "type": "filesystem",
    "location": "test-location",
    "filePath": "data.csv"
  }
}


When used in a script, it might get transformed into:

JSON
 
{\\"source\\":{\\"type\\":\\"filesystem\\",\\"location\\":\\"test-location\\",\\"filePath\\":\\"data.csv\\"}}


This results in broken APIs or unexpected behavior. 

Why It Doesn't Work

  • Double-escaping: Azure DevOps escapes quotes and backslashes automatically when passing variables.
  • API expectation mismatch: APIs expect JSON in a valid format, but the pipeline delivers an improperly escaped version.
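
You can observe the mangling directly, before involving any API. A minimal sketch, assuming the json_data variable defined in Step 1 below and the jq CLI available on the agent:

Shell
 
# Azure DevOps replaces $(json_data) textually before Bash runs,
# so even single quotes do not prevent the macro substitution:
echo '$(json_data)'        # prints the escaped form Azure DevOps stored
echo '$(json_data)' | jq . # jq rejects the escaped form as invalid JSON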

To use JSON variables in Azure pipelines, the key is to normalize the JSON and escape it correctly. Here's how you can solve the problem step by step. 

Step 1: Define a JSON Variable in the Azure Pipeline UI

In the pipeline UI, add a variable named json_data with the following value (dummy data):

JSON
 
{
  "source": {
    "type": "filesystem",
    "location": "test-location",
    "filePath": "data.csv"
  },
  "sink": {
    "type": "filesystem",
    "location": "output-location",
    "filePath": "processed-data.csv"
  }
}


When stored in Azure DevOps, this JSON will be automatically escaped, resulting in something like: 

JSON
 
{\\"source\\":{\\"type\\":\\"filesystem\\",\\"location\\":\\"test-location\\",\\"filePath\\":\\"data.csv\\"},\\"sink\\":{\\"type\\":\\"filesystem\\",\\"location\\":\\"output-location\\",\\"filePath\\":\\"processed-data.csv\\"}}


Step 2: Process the Variable in Your YAML Script

Here’s how to normalize the escaped JSON in your YAML file:

YAML
 
trigger:
  branches:
    include:
      - main

jobs:
- job: HandleJsonVariable
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - script: |
      # Step 1: Retrieve the raw JSON from the pipeline variable
      raw_json=$(echo "$(json_data)")

      # Step 2: Normalize the escaped JSON: collapse \\ into \ and
      # strip the backslash before each quote so the result parses as JSON
      normalized_json=$(echo "$raw_json" | sed -e 's/\\\\/\\/g' -e 's/\\"/"/g')

      # Step 3: Debug normalized JSON
      echo "Normalized JSON:"
      echo "$normalized_json" | jq .

      # Step 4: Escape quotes for specific use cases (e.g., if embedding JSON in another JSON payload)
      escaped_json=$(echo "$normalized_json" | sed 's/"/\\"/g')

      # Debug escaped JSON
      echo "Escaped JSON:"
      echo "\"$escaped_json\""

      # Step 5: Construct a general JSON payload dynamically
      final_payload=$(cat <<EOF
      {
        "example_key": "example_value",
        "payload_data": "$escaped_json"
      }
      EOF
      )

      # Step 6: Debug the final payload
      echo "Final Payload:"
      echo "$final_payload" | jq .

      # Step 7: Simulate sending the payload to an external API (replace with actual API details if needed)
      echo "Simulating API Call..."
      response=$(curl -s -X POST https://example-api.com/endpoint \
      -H "Content-Type: application/json" \
      -d "$final_payload")

      # Step 8: Debug the API response
      echo "API Response:"
      echo "$response" | jq .

    displayName: 'Handle and Process JSON Variable'


Retrieve the Variable

Fetch the json_data variable using $(json_data).
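
Taken from the script step above:

Shell
 
# $(json_data) is macro-substituted by Azure DevOps before the script runs
raw_json=$(echo "$(json_data)")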

Normalize the JSON

Collapse \\ into \ and strip the backslash before each quote so the string parses as valid JSON again.

Shell
 
normalized_json=$(echo "$raw_json" | sed -e 's/\\\\/\\/g' -e 's/\\"/"/g')


Escape Quotes (Optional)

If the API requires JSON as a string within another JSON, escape quotes:

Shell
 
escaped_json=$(echo "$normalized_json" | sed 's/"/\\"/g')


Build the Payload

Construct the JSON payload dynamically using a cat <<EOF block. 
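
From the script above:

Shell
 
final_payload=$(cat <<EOF
{
  "example_key": "example_value",
  "payload_data": "$escaped_json"
}
EOF
)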

Debug Output

Use echo and jq to validate each transformation step.
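
Since jq exits non-zero when its input is not valid JSON, it can double as an assertion. A small sketch (the explicit exit is an addition, not part of the script above):

Shell
 
# Fail the step early if the payload is not valid JSON
echo "$final_payload" | jq . || exit 1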

Key Takeaways

  • Azure variables are strings: JSON must be processed before use.
  • Normalize and escape as needed: Properly escape quotes and backslashes.
  • Debugging is essential: Use tools like jq to validate JSON at every step.

Passing JSON variables in Azure DevOps pipelines can be challenging, but with the right approach, it's manageable. By normalizing and escaping JSON, you can ensure your pipeline works seamlessly with APIs or other JSON-consuming components.
