What can a Snowflake Data Scientist do in the Snowflake Marketplace as a Provider?
A, B, C, D
Explanation:
All are correct!
About the Snowflake Marketplace
You can use the Snowflake Marketplace to discover and access third-party data and services, as well
as market your own data products across the Snowflake Data Cloud.
As a data provider, you can use listings on the Snowflake Marketplace to share curated data offerings
with many consumers simultaneously, rather than maintaining sharing relationships with each
individual consumer. With Paid Listings, you can also charge for your data products.
As a consumer, you might use the data provided on the Snowflake Marketplace to explore and
access the following:
Historical data for research, forecasting, and machine learning.
Up-to-date streaming data, such as current weather and traffic conditions.
Specialized identity data for understanding subscribers and audience targets.
New insights from unexpected sources of data.
The Snowflake Marketplace is available globally to all non-VPS Snowflake accounts hosted on
Amazon Web Services, Google Cloud Platform, and Microsoft Azure, with the exception of
Microsoft Azure Government. Support for Microsoft Azure Government is planned.
What can a Snowflake Data Scientist do in the Snowflake Marketplace as a Consumer?
A, B, C, D
Explanation:
As a consumer, you can do the following:
· Discover and test third-party data sources.
· Receive frictionless access to raw data products from vendors.
· Combine new datasets with your existing data in Snowflake to derive new business insights.
· Have datasets available instantly and updated continually for users.
· Eliminate the costs of building and maintaining various APIs and data pipelines to load and update
data.
· Use the business intelligence (BI) tools of your choice.
Which one is the incorrect option to share data in Snowflake?
B
Explanation:
Options for Sharing in Snowflake
You can share data in Snowflake using one of the following options:
· a Listing, in which you offer a share and additional metadata as a data product to one or more
accounts,
· a Direct Share, in which you directly share specific database objects (a share) to another account in
your region,
· a Data Exchange, in which you set up and manage a group of accounts and offer a share to that
group.
Data providers add Snowflake objects (databases, schemas, tables, secure views, etc.) to a share
using which of the following options?
B, C
Explanation:
What is a Share?
Shares are named Snowflake objects that encapsulate all of the information required to share a
database.
Data providers add Snowflake objects (databases, schemas, tables, secure views, etc.) to a share
using either or both of the following options:
Option 1: Grant privileges on objects to a share via a database role.
Option 2: Grant privileges on objects directly to a share.
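The two options above can be sketched in SQL as follows. This is a minimal illustration, not from the question itself: the share, database, schema, table, and database role names (sales_s, sales_db, public, orders, sales_db.viewer) are all assumed, and the database role is assumed to already hold grants on the shared objects.

```sql
-- Option 1: grant a database role (which holds the object privileges) to the share
CREATE SHARE sales_s;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_s;
GRANT DATABASE ROLE sales_db.viewer TO SHARE sales_s;

-- Option 2: grant privileges on objects directly to the share
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_s;
GRANT SELECT ON TABLE sales_db.public.orders TO SHARE sales_s;
```

Both options can be combined on the same share; either way, consumers see only the objects whose privileges reach the share.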
You choose which accounts can consume data from the share by adding the accounts to the share.
After a database is created (in a consumer account) from a share, all the shared objects are
accessible to users in the consumer account.
Shares are secure, configurable, and controlled completely by the provider account:
· New objects added to a share become immediately available to all consumers, providing real-time
access to shared data.
· Access to a share (or any of the objects in a share) can be revoked at any time.
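Revocation mirrors granting. A hypothetical sketch, reusing illustrative names (sales_s, sales_db.public.orders, xy12345) that are not part of the question:

```sql
-- Remove a consumer account from the share entirely
ALTER SHARE sales_s REMOVE ACCOUNTS = xy12345;

-- Or revoke access to a single object while leaving the share in place
REVOKE SELECT ON TABLE sales_db.public.orders FROM SHARE sales_s;
```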
Secure Data Sharing does not let you share which of the following selected objects in a database in
your account with other Snowflake accounts?
A
Explanation:
Secure Data Sharing lets you share selected objects in a database in your account with other
Snowflake accounts. You can share the following Snowflake database objects:
Tables
External tables
Secure views
Secure materialized views
Secure UDFs
Snowflake enables the sharing of databases through shares, which are created by data providers and
“imported” by data consumers.
Which one is an incorrect understanding about providers of a Direct Share?
D
Explanation:
If you want to provide a share to many accounts, you might want to use a listing or a data exchange.
As a Data Scientist looking to use a reader account, which are the correct considerations about
Reader Accounts for Third-Party Access?
D
Explanation:
Data sharing is only supported between Snowflake accounts. As a data provider, you might want to
share data with a consumer who does not already have a Snowflake account or is not ready to
become a licensed Snowflake customer.
To facilitate sharing data with these consumers, you can create reader accounts. Reader accounts
(formerly known as “read-only accounts”) provide a quick, easy, and cost-effective way to share data
without requiring the consumer to become a Snowflake customer.
Each reader account belongs to the provider account that created it. As a provider, you use shares to
share databases with reader accounts; however, a reader account can only consume data from the
provider account that created it.
So, data sharing is possible between Snowflake and non-Snowflake accounts via a reader account.
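A reader account is created with CREATE MANAGED ACCOUNT. The sketch below uses placeholder credentials and an illustrative share name (sales_s); the reader account's locator is returned on creation and is not guessed here.

```sql
-- Reader accounts are managed accounts of type READER, owned by the provider
CREATE MANAGED ACCOUNT reader_acct1
  ADMIN_NAME = 'reader_admin',         -- placeholder admin login
  ADMIN_PASSWORD = 'Str0ngPassw0rd!',  -- placeholder; use a real secret
  TYPE = READER;

-- The provider then adds the reader account (by its locator) to a share as usual
ALTER SHARE sales_s ADD ACCOUNTS = <reader_account_locator>;
```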
A Data Scientist, as a data provider, needs to allow consumers to access all databases and database
objects in a share by granting a single privilege on the shared database. Which one is the incorrect
SnowSQL command used by her for this task?
Assuming:
A database named product_db exists with a schema named product_agg and a table named
Item_agg.
The database, schema, and table will be shared with two accounts named xy12345 and yz23456.
1. USE ROLE accountadmin;
2. CREATE DIRECT SHARE product_s;
3. GRANT USAGE ON DATABASE product_db TO SHARE product_s;
4. GRANT USAGE ON SCHEMA product_db.product_agg TO SHARE product_s;
5. GRANT SELECT ON TABLE product_db.product_agg.Item_agg TO SHARE product_s;
6. SHOW GRANTS TO SHARE product_s;
7. ALTER SHARE product_s ADD ACCOUNTS=xy12345, yz23456;
8. SHOW GRANTS OF SHARE product_s;
C
Explanation:
CREATE SHARE product_s; is the correct SnowSQL command to create a share object; CREATE
DIRECT SHARE is not valid syntax. The rest of the commands are correct.
https://docs.snowflake.com/en/user-guide/data-sharing-provider#creating-a-share-using-sql
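On the consumer side, the share is then imported into a local database, and a single privilege grant exposes everything in it. A sketch assuming a hypothetical provider account locator ab67890 and a consumer role named analyst (neither appears in the question):

```sql
-- Consumer side: import the share as a read-only database
CREATE DATABASE product_db_shared FROM SHARE ab67890.product_s;

-- One privilege on the shared database grants access to all objects in it
GRANT IMPORTED PRIVILEGES ON DATABASE product_db_shared TO ROLE analyst;
```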
Which object records data manipulation language (DML) changes made to tables, including inserts,
updates, and deletes, as well as metadata about each change, so that actions can be taken on the
changed data in data science pipelines?
C
Explanation:
A stream object records data manipulation language (DML) changes made to tables, including
inserts, updates, and deletes, as well as metadata about each change, so that actions can be taken
using the changed data. This process is referred to as change data capture (CDC). An individual table
stream tracks the changes made to rows in a source table. A table stream (also referred to as simply a
“stream”) makes a “change table” available of what changed, at the row level, between two
transactional points of time in a table. This allows querying and consuming a sequence of change
records in a transactional fashion.
Streams can be created to query change data on the following objects:
· Standard tables, including shared tables.
· Views, including secure views
· Directory tables
· Event tables
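A minimal CDC sketch, assuming hypothetical source and target tables raw_events(id, payload) and curated_events (names not from the question):

```sql
-- Create a stream to capture row-level changes on a table
CREATE STREAM raw_events_stream ON TABLE raw_events;

-- DML on raw_events is now recorded; consuming the stream inside a DML
-- statement (e.g. INSERT ... SELECT) advances its offset
INSERT INTO curated_events
  SELECT id, payload
  FROM raw_events_stream
  WHERE METADATA$ACTION = 'INSERT';
```

Querying a stream with a plain SELECT does not advance the offset; only consuming it in a DML statement does, which is what makes pipelines on top of streams transactional.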
Which of the following additional metadata columns does a stream contain that can be used to build
efficient data science pipelines and transform only new or modified data?
A, C, E
Explanation:
A stream stores an offset for the source object and not any actual table columns or data. When
queried, a stream accesses and returns the historic data in the same shape as the source object (i.e.
the same column names and ordering) with the following additional columns:
METADATA$ACTION
Indicates the DML operation (INSERT, DELETE) recorded.
METADATA$ISUPDATE
Indicates whether the operation was part of an UPDATE statement. Updates to rows in the source
object are represented as a pair of DELETE and INSERT records in the stream, with the
METADATA$ISUPDATE metadata column set to TRUE.
Note that streams record the differences between two offsets. If a row is added and then updated
within the current offset window, the delta change is a single new row, and its METADATA$ISUPDATE
column records a FALSE value.
METADATA$ROW_ID
Specifies the unique and immutable ID for the row, which can be used to track changes to specific
rows over time.
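Putting the three columns together, the sketch below selects only the rows produced by UPDATE statements, keeping the INSERT half of each DELETE/INSERT pair to get the new row images (the stream and column names reuse the earlier illustrative raw_events_stream example and are not from the question):

```sql
-- True updates only: the INSERT side of each UPDATE's DELETE/INSERT pair
SELECT id, payload, METADATA$ROW_ID
FROM raw_events_stream
WHERE METADATA$ACTION = 'INSERT'
  AND METADATA$ISUPDATE = TRUE;
```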