Part 2: Data Manipulation

By now, you know how quickly data can be loaded into Keboola. You should have four new tables in your Storage: account, opportunity, level, and user.

In this part of the tutorial, we’ll show you how to manipulate data in Storage using transformations. Our goal is to create a denormalized table from the input tables and make a few minor modifications to it.

Creating a Transformation

  1. To start, navigate to the Keboola Transformations section.

    Screenshot - Transformations Console

  2. Next, click the Create Transformation button and choose Snowflake SQL Transformation (or another SQL transformation, depending on your project’s backend).

    Screenshot - Create Transformation

  3. Name it Denormalize opportunities, and don’t forget to provide a description. Transformations can be organized into folders; you can place a transformation in a folder when creating it or move it to a folder later. Enter Opportunity as the folder name.

    Screenshot - Name Transformation

Keboola transformations are built on a few fundamental building blocks. It’s important to note that a transformation runs in a dedicated database schema, so queries are not executed directly against your Storage tables. Instead, the system clones the selected tables into the dedicated transformation schema, executes your queries there, and finally unloads the created or modified tables back to Storage.

  1. Input Mapping: This specifies which Storage tables are available to your transformation. In the default setup, tables not listed in the input mapping cannot be used in the transformation.
  2. Output Mapping: This specifies which tables created or modified by your transformation are written back to Storage after the transformation finishes successfully. Tables not listed in the output mapping are neither modified in Storage nor stored permanently; they are treated as temporary (see the sketch below).
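
To make the mappings concrete, here is a minimal sketch of the kind of SQL such a transformation might run. The table names (account, opportunity, level, user) come from the tables loaded earlier in the tutorial; the column names (id, account_id, owner_id, level_id, name) and the output table name opportunity_denorm are only illustrative assumptions and will likely differ from your actual schema.

    -- Minimal sketch only; column names such as "account_id", "owner_id",
    -- "level_id" and the output table "opportunity_denorm" are assumptions,
    -- not the tutorial's actual schema.
    -- "opportunity", "account", "user" and "level" would need to be listed in
    -- the transformation's Input Mapping, and "opportunity_denorm" in its
    -- Output Mapping, for the query to read from and write back to Storage.
    CREATE TABLE "opportunity_denorm" AS
    SELECT
        o.*,                              -- all opportunity columns
        a."name" AS "account_name",       -- joined from account
        u."name" AS "user_name",          -- joined from user (opportunity owner)
        l."name" AS "level_name"          -- joined from level
    FROM "opportunity" o
        LEFT JOIN "account" a ON o."account_id" = a."id"
        LEFT JOIN "user"    u ON o."owner_id"   = u."id"
        LEFT JOIN "level"   l ON u."level_id"   = l."id";

Whether you write one query like this or several statements, only the tables listed in the output mapping (here, the assumed opportunity_denorm) are unloaded back to Storage; anything else created in the transformation schema is discarded when the transformation finishes.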