ALTER STREAMING TABLE

Preview

This feature is in Public Preview.

Applies to: Databricks SQL

Allows you to perform any of the following actions:

  • Add a schedule or trigger for refreshing an existing streaming table.
  • Alter an existing refresh schedule or trigger for a streaming table.
  • Drop the refresh schedule or trigger for a streaming table. If there is no schedule or trigger, the object needs to be refreshed manually to reflect the latest data.

To add or alter a comment on a streaming table, use COMMENT ON.

Note

Altering a pipeline-created dataset in ways that contradict the defining SQL can cause some changes to be reverted. See Using ALTER commands with Lakeflow Spark Declarative Pipelines.

Syntax

ALTER STREAMING TABLE table_name
{
  { ADD | ALTER } schedule |
  DROP SCHEDULE |
  ALTER COLUMN column_clause |
  SET ROW FILTER clause |
  DROP ROW FILTER |
  SET TAGS clause |
  UNSET TAGS clause
}

schedule
{ SCHEDULE [ REFRESH ] schedule_clause |
TRIGGER ON UPDATE [ AT MOST EVERY trigger_interval ] }

schedule_clause
{ EVERY number { HOUR | HOURS | DAY | DAYS | WEEK | WEEKS } |
CRON cron_string [ AT TIME ZONE timezone_id ] }

column_clause
column_identifier
{ COMMENT clause |
SET MASK clause |
DROP MASK |
SET TAGS clause |
UNSET TAGS clause }

Parameters

  • table_name

    The name of the streaming table to be altered. The name must not include a temporal specification.

  • schedule

    Add or alter a SCHEDULE or TRIGGER clause on the table.

    Note

    You can't use this command to modify the schedule of a streaming table created with Lakeflow Spark Declarative Pipelines. Use the pipeline editor instead.

    • SCHEDULE [ REFRESH ] schedule_clause

      • EVERY number { HOUR | HOURS | DAY | DAYS | WEEK | WEEKS }

        To schedule a refresh that occurs periodically, use EVERY syntax. When EVERY syntax is specified, the streaming table is refreshed at the specified interval, expressed in the given time unit (HOUR, HOURS, DAY, DAYS, WEEK, or WEEKS). The following table lists accepted integer values for number.

        Time unit        Integer value
        HOUR or HOURS    1 <= H <= 72
        DAY or DAYS      1 <= D <= 31
        WEEK or WEEKS    1 <= W <= 8

        Note

        The singular and plural forms of the included time unit are semantically equivalent.

      • CRON cron_string [ AT TIME ZONE timezone_id ]

        Schedules a refresh using a Quartz cron expression. Valid time_zone_values are accepted. AT TIME ZONE LOCAL is not supported.

        If AT TIME ZONE is absent, the session time zone is used; if the session time zone is not set either, an error is thrown. SCHEDULE is semantically equivalent to SCHEDULE REFRESH.

    • TRIGGER ON UPDATE [ AT MOST EVERY trigger_interval ]

      Beta

      The TRIGGER ON UPDATE feature is in Beta.

      Sets the table to refresh when an upstream data source is updated, at most once every minute. Set a value for AT MOST EVERY to require a minimum time between refreshes.

      The upstream data sources must be either external or managed Delta tables (including materialized views or streaming tables), or managed views whose dependencies are limited to supported table types.

      Enabling file events can make triggers more performant, and increases some of the limits on trigger updates.

      The trigger_interval is an INTERVAL value of at least 1 minute.

      TRIGGER ON UPDATE has the following limitations:

      • No more than 10 upstream data sources per streaming table when using TRIGGER ON UPDATE.
      • A maximum of 1,000 streaming tables or materialized views can be specified with TRIGGER ON UPDATE.
      • The AT MOST EVERY clause defaults to 1 minute, and cannot be less than 1 minute.
  • DROP SCHEDULE

    Removes a schedule from the streaming table.

  • ALTER COLUMN clause

    See ALTER COLUMN clause.

  • SET ROW FILTER clause

    Preview

    This feature is in Public Preview.

    Adds a row filter function to the streaming table. All subsequent queries to the streaming table receive a subset of the rows where the function evaluates to boolean TRUE. This can be useful for fine-grained access control purposes where the function can inspect the identity or group memberships of the invoking user to determine whether to filter certain rows.

    Row filters added after creation only propagate to downstream tables after the next update. For continuous pipelines, this requires a pipeline restart. For a usage sketch, see the example after this parameter list.

  • DROP ROW FILTER

    Preview

    This feature is in Public Preview.

    Drops the row filter from the streaming table, if any. Future queries return all rows from the table without any automatic filtering.

  • SET TAGS ( { tag_name = tag_value } [, ...] )

    Preview

    This feature is in Public Preview.

    Apply tags to the streaming table. You need to have APPLY TAG permission to add tags to the streaming table. For a tagging sketch, see the end of the Examples section.

    • tag_name

      A literal STRING. The tag_name must be unique within the streaming table.

    • tag_value

      A literal STRING.

  • UNSET TAGS ( tag_name [, ...] )

    Preview

    This feature is in Public Preview.

    Remove tags from the streaming table. You need to have APPLY TAG permission to remove tags from the streaming table.

    • tag_name

      A literal STRING. The tag_name must be unique within the streaming table.
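
The following sketch shows how a row filter might be set and dropped on a streaming table. The table name my_st, the function my_catalog.my_schema.region_filter (a SQL UDF returning BOOLEAN), and the region column are hypothetical; the statements assume the standard row filter clause, SET ROW FILTER function_name ON ( column_list ).

-- Applies a hypothetical row filter function, passing the region column
-- as its argument. Subsequent queries only return rows for which the
-- function evaluates to TRUE.
> ALTER STREAMING TABLE my_st
SET ROW FILTER my_catalog.my_schema.region_filter ON (region);

-- Drops the row filter so queries return all rows again.
> ALTER STREAMING TABLE my_st
DROP ROW FILTER;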

ALTER COLUMN clause

Preview

This feature is in Public Preview.

Changes a property of a column.

Syntax

  ALTER COLUMN
column_identifier
{ COMMENT comment |
SET MASK mask_clause |
DROP MASK |
SET TAGS clause |
UNSET TAGS clause }

Parameters

  • column_identifier

    The name of the column to be altered.

  • COMMENT comment

    Changes the description of the column_identifier column. comment must be a STRING literal.

  • SET MASK clause

    Adds a column mask function to anonymize sensitive data. All subsequent queries from that column will receive the result of evaluating that function over the column in place of the column's original value. This can be useful for fine-grained access control purposes where the function can inspect the identity or group memberships of the invoking user to determine whether to redact the value.

    Column masks added after creation only propagate to downstream tables after the next update. For continuous pipelines, this requires a pipeline restart. For a usage sketch, see the example after this parameter list.

  • DROP MASK

    Removes the column mask for this column, if any. Future queries from this column will receive the column's original values.

  • SET TAGS ( { tag_name = tag_value } [, ...] )

    Apply tags to the column. You need to have APPLY TAG permission to add tags to the column.

    • tag_name

      A literal STRING. The tag_name must be unique within the table or column.

    • tag_value

      A literal STRING.

  • UNSET TAGS ( tag_name [, ...] )

    Remove the given tags from the column. You need to have APPLY TAG permission to remove tags from the column.

    • tag_name

      A literal STRING. The tag_name must be unique within the table or column.
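
The sketch below shows how a column mask might be set and dropped. The table my_st, the column ssn, and the masking function my_catalog.my_schema.ssn_mask (a SQL UDF that takes the column value and returns a value of the same type) are hypothetical.

-- Applies a hypothetical masking function to the ssn column. Subsequent
-- queries receive the function's result instead of the original value.
> ALTER STREAMING TABLE my_st
ALTER COLUMN ssn SET MASK my_catalog.my_schema.ssn_mask;

-- Drops the mask so queries return the original values again.
> ALTER STREAMING TABLE my_st
ALTER COLUMN ssn DROP MASK;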

Examples

SQL
-- Adds a schedule to refresh the streaming table whenever its upstream data gets updated.
> ALTER STREAMING TABLE my_st
ADD TRIGGER ON UPDATE;

-- Alters the schedule to refresh the streaming table when its upstream data
-- gets updated, with no more than one update per hour.
> ALTER STREAMING TABLE my_st
ALTER TRIGGER ON UPDATE AT MOST EVERY INTERVAL 1 hour;

-- Alters the schedule to run every one hour.
> ALTER STREAMING TABLE my_st
ALTER SCHEDULE EVERY 1 HOUR;

-- Alters the schedule to refresh the streaming table once a day
-- at midnight in Los Angeles
> ALTER STREAMING TABLE my_st
ALTER SCHEDULE CRON '0 0 0 * * ? *' AT TIME ZONE 'America/Los_Angeles';

-- Drops the schedule for a streaming table
> ALTER STREAMING TABLE my_st
DROP SCHEDULE;

-- Adds a column comment
> ALTER STREAMING TABLE my_st
ALTER COLUMN column_name COMMENT 'test';
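
The statements below sketch table-level and column-level tagging. The tag names and values ('environment', 'production', 'owner', 'pii', and so on) and the email column are hypothetical, and APPLY TAG permission is required.

-- Applies tags to the streaming table.
> ALTER STREAMING TABLE my_st
SET TAGS ('environment' = 'production', 'owner' = 'data-eng');

-- Removes one of the tags by name.
> ALTER STREAMING TABLE my_st
UNSET TAGS ('owner');

-- Applies a tag to a single column.
> ALTER STREAMING TABLE my_st
ALTER COLUMN email SET TAGS ('pii' = 'true');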