Universal Containers (UC) uses Salesforce for tracking opportunities (Opportunity). UC uses an
internal ERP system for tracking deliveries and invoicing. The ERP system supports SOAP API and
OData for bi-directional integration between Salesforce and the ERP system. UC has about one
million opportunities. For each opportunity, UC sends 12 invoices, one per month. UC sales reps have
requirements to view current invoice status and invoice amount from the opportunity page. When
creating an object to model invoices, what should the architect recommend, considering
performance and data storage space?
B
Explanation:
Creating an external object Invoice_x with a Lookup relationship with Opportunity is the best option
for modeling invoices, considering performance and data storage space. An external object allows
the data to be stored in the ERP system and accessed via OData in Salesforce. This reduces the data
storage consumption in Salesforce and improves the performance of queries and reports. A Lookup
relationship allows the sales reps to view the invoice status and amount from the opportunity
page.
The other options would either consume more data storage space, require additional
customization, or not provide real-time data access.
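As a rough sketch (the external object and field API names below, such as Invoice__x, Invoice_Status__c, Invoice_Amount__c, and Opportunity__c, are assumptions made purely for illustration), the ERP-hosted invoices can be read with ordinary SOQL once Salesforce Connect exposes them as an external object:

    // Sketch only: Invoice__x and its fields are hypothetical API names.
    Id opportunityId = [SELECT Id FROM Opportunity LIMIT 1].Id;
    List<Invoice__x> invoices = [
        SELECT Invoice_Status__c, Invoice_Amount__c
        FROM Invoice__x
        WHERE Opportunity__c = :opportunityId
    ];

Because the rows are fetched from the ERP's OData endpoint at query time, the roughly 12 million invoice records (one million opportunities times 12 invoices each) never count against Salesforce data storage.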
Universal Containers has a large number of Opportunity fields (100) that they want to track field
history on. Which two actions should an architect perform in order to meet this requirement?
Choose 2 answers
A, B
Explanation:
Creating a custom object to store a copy of the record when changed and creating a custom object to
store the previous and new field values are two possible actions that an architect can perform to
meet the requirement of tracking field history on 100 Opportunity fields. A custom object can store
more fields and records than the standard field history tracking feature, which has a limit of 20 fields
per object and 18 or 24 months of data retention. A custom object can also be used for reporting and
analysis of field history data.
The other options are not feasible or effective for meeting the
requirement.
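One common way to build this (a sketch only; the custom object Opportunity_Field_History__c and its fields are hypothetical names chosen for illustration) is an Apex trigger that writes one history record per changed field:

    // Sketch only: Opportunity_Field_History__c and its fields are hypothetical names.
    trigger TrackOpportunityFieldHistory on Opportunity (after update) {
        // In practice the tracked field list could come from Custom Metadata.
        List<String> trackedFields = new List<String>{'StageName', 'Amount', 'CloseDate'};
        List<Opportunity_Field_History__c> entries = new List<Opportunity_Field_History__c>();
        for (Opportunity opp : Trigger.new) {
            Opportunity oldOpp = Trigger.oldMap.get(opp.Id);
            for (String fieldName : trackedFields) {
                Object oldVal = oldOpp.get(fieldName);
                Object newVal = opp.get(fieldName);
                if (oldVal != newVal) {
                    entries.add(new Opportunity_Field_History__c(
                        Opportunity__c = opp.Id,
                        Field_Name__c  = fieldName,
                        Old_Value__c   = String.valueOf(oldVal),
                        New_Value__c   = String.valueOf(newVal)
                    ));
                }
            }
        }
        if (!entries.isEmpty()) {
            insert entries;
        }
    }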
DreamHouse Realty has a Salesforce org that is used to manage Contacts.
What are two things an Architect should consider using to maintain data quality in this situation?
(Choose two.)
B, C
Explanation:
Using Salesforce duplicate management and using validation rules on new record create and edit are
two things that an architect should consider using to maintain data quality for managing Contacts.
Salesforce duplicate management allows the architect to create matching rules and duplicate rules to
identify, prevent, or allow duplicate records based on various criteria. Validation rules allow the
architect to enforce data quality standards and business logic by displaying error messages when
users try to save invalid data.
The other options are not relevant or helpful for maintaining data
quality.
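For example, a Validation Rule with an error condition formula such as the one below (which fields to require is an assumption about DreamHouse Realty's data standards) blocks saves that would create incomplete Contacts:

    /* Hypothetical error condition formula: block saving a Contact that has
       neither an email address nor a phone number */
    ISBLANK(Email) && ISBLANK(Phone)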
Universal Containers is looking to use Salesforce to manage their sales organization. They will be
migrating legacy account data from two aging systems into Salesforce. Which two design
considerations should an architect take to minimize data duplication? Choose 2 answers
B, C
Explanation:
Cleaning data before importing to Salesforce and using Salesforce matching and duplicate rules are
two design considerations that an architect should take to minimize data duplication when migrating
legacy account data from two aging systems into Salesforce. Cleaning data before importing involves
removing or correcting any inaccurate, incomplete, or inconsistent data from the source systems, as
well as identifying and resolving any potential duplicates. This ensures that only high-quality and
unique data is imported to Salesforce. Using Salesforce matching and duplicate rules allows the
architect to define how Salesforce identifies duplicate records during import and how users can
handle them. This prevents or reduces the creation of duplicate records in Salesforce and improves
data quality. The other options are not effective or recommended for minimizing data duplication.
Universal Containers (UC) has a Salesforce instance with over 10,000 Account records. They have
noticed similar, but not identical, Account names and addresses. What should UC do to ensure
proper data quality?
C
Explanation:
Enabling Account de-duplication by creating matching rules in Salesforce, which will mass merge
duplicate Accounts, is what UC should do to ensure proper data quality for their Account records.
Matching rules allow UC to define how Salesforce identifies duplicate Accounts based on various
criteria, such as name, address, phone number, etc. Mass merge allows UC to merge the duplicate
Accounts identified by those matching rules in bulk. This simplifies and automates the process
of de-duplicating Accounts and improves data quality. The other options are either more time-
consuming, costly, or error-prone for ensuring proper data quality.
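For reference, once matching rules have surfaced a duplicate set, the merge itself can also be performed programmatically; Apex merge DML accepts one master record and up to two duplicates per statement (the Account names and the choice of surviving master in this sketch are assumptions):

    // Sketch only: the Account names and master-selection criteria are assumptions.
    Account master = [
        SELECT Id FROM Account
        WHERE Name = 'Acme Corp'
        ORDER BY CreatedDate
        LIMIT 1
    ];
    List<Account> duplicates = [
        SELECT Id FROM Account
        WHERE Name LIKE 'Acme Corp%' AND Id != :master.Id
        LIMIT 2
    ];
    // Merge DML reparents the duplicates' related records onto the master
    // and deletes the duplicate Accounts.
    merge master duplicates;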
Cloud Kicks stores Invoice records in a custom object. Invoice records are being sent to the
Accounting department with missing States and incorrectly formatted Postal Codes.
Which two actions should Cloud Kicks take to improve data quality? (Choose two.)
C, D
Explanation:
Utilizing a Validation Rule with a REGEX operator on Postal Code and utilizing a Validation Rule with a
CONTAINS operator on address fields are two actions that Cloud Kicks should take to improve data
quality for their Invoice records. A Validation Rule with a REGEX operator can check if the Postal Code
field matches a specific pattern or format, such as a five-digit number or a combination of letters and
numbers. A Validation Rule with a CONTAINS operator can check if the address fields contain certain
values, such as valid state abbreviations or country names. These Validation Rules can prevent users
from saving invalid or incomplete data and display error messages to guide them to correct the
data.
The other options are not effective or recommended for improving data quality, as they would
either require additional customization, not enforce data standards, or not address the specific issues
of missing states and incorrectly formatted postal codes.
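As an illustration, the Postal Code rule could use an error condition formula like the following (the Postal_Code__c field name and the US ZIP/ZIP+4 pattern are assumptions about Cloud Kicks' data):

    /* Error condition fires when Postal Code is present but does not match
       12345 or 12345-6789 */
    NOT(ISBLANK(Postal_Code__c)) &&
    NOT(REGEX(Postal_Code__c, "\\d{5}(-\\d{4})?"))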
Universal Containers (UC) has multi-level account hierarchies that represent departments within
their major Accounts. Users are creating duplicate Contacts across multiple departments. UC wants
to clean the data so as to have a single Contact across departments. What two solutions should UC
implement to cleanse their data? Choose 2 answers
A, B
Explanation:
Making use of a third-party tool to help merge duplicate Contacts across Accounts and using
Data.com to standardize Contact address information to help identify duplicates are two solutions
that UC should implement to cleanse their data and have a single Contact across departments. A
third-party tool, such as an app from the AppExchange, can provide advanced features and
capabilities for finding and merging duplicate Contacts across different Accounts, based on various
criteria and rules. Data.com can provide address verification and standardization services that can
enhance the quality and consistency of Contact address information and make it easier to identify
duplicates.
The other options are not feasible or effective for cleansing the data, as they would either
not work across different Accounts, not address the root cause of duplication, or not provide
sufficient functionality for merging duplicates.
Universal Containers has defined a new Data Quality Plan for their Salesforce data and wants to know
how they can enforce it throughout the organization. Which two approaches should an architect
recommend to enforce this new plan?
Choose 2 answers
A, B
Explanation:
Scheduling a weekly dashboard displaying records that are missing information to be sent to
managers for review and using Workflow, Validation Rules, and Force.com code (Apex) to enforce
critical business processes are two approaches that an architect should recommend to enforce the
new Data Quality Plan for UC’s Salesforce data. Scheduling a weekly dashboard can provide a regular
and visual way of monitoring the data quality and identifying any gaps or issues that need to be
addressed by the managers or users. Using Workflow, Validation Rules, and Apex can provide various
ways of enforcing data quality standards and business logic by automating actions, displaying error
messages, or executing custom code when users create or edit records.
The other options are not
suitable or helpful for enforcing the Data Quality Plan, as they would either not provide real-time
feedback, not prevent data quality issues, or not leverage the capabilities of Salesforce.
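As a minimal sketch of the Apex piece (the specific rule enforced here, requiring an Amount before an Opportunity reaches Closed Won, is an assumed example of a critical business process):

    trigger EnforceOpportunityQuality on Opportunity (before update) {
        for (Opportunity opp : Trigger.new) {
            // Block the save when a critical field is missing at a critical stage.
            if (opp.StageName == 'Closed Won' && opp.Amount == null) {
                opp.addError('Amount is required before an Opportunity can be marked Closed Won.');
            }
        }
    }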
Universal Containers wants to implement a data-quality process to monitor the data that users are
manually entering into the system through the Salesforce UI. Which approach should the architect
recommend?
C
Explanation:
Utilizing an app from the AppExchange to create data-quality dashboards is the approach that the
architect should recommend for UC to implement a data-quality process to monitor the data that
users are manually entering into the system through the Salesforce UI. An app from the
AppExchange can provide ready-made or customizable dashboards that can display various metrics
and indicators of data quality, such as completeness, accuracy, consistency, timeliness, etc. These
dashboards can help UC to measure and evaluate their data quality performance and identify any
areas that need improvement or attention. The other options are not relevant or effective for
implementing a data-quality process, as they would either not address the issue of manual data
entry, not provide data-quality monitoring, or not leverage the benefits of Salesforce.
A manager at Cloud Kicks is importing Leads into Salesforce and needs to avoid creating duplicate
records.
Which two approaches should the manager take to achieve this goal? (Choose two.)
A, B
Explanation:
Acquiring an AppExchange Lead de-duplication application and implementing Salesforce Matching
and Duplicate Rules are two approaches that the manager at Cloud Kicks should take to avoid
creating duplicate records when importing Leads into Salesforce. An AppExchange Lead de-
duplication application can provide additional features and functionality for finding and preventing
duplicate Leads during import, such as fuzzy matching, custom rules, mass merge, etc. Salesforce
Matching and Duplicate Rules can allow the manager to define how Salesforce identifies duplicate
Leads based on various criteria and how users can handle them during import, such as blocking,
allowing, or alerting them. The other options are not feasible or effective for avoiding duplicate
records, as they would either not work during import, not provide de-duplication capabilities, or
require additional customization.
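For context, the same Matching and Duplicate Rules can also be exercised from Apex before an insert; the sketch below (the sample Lead data and the skip-on-match policy are assumptions, and it relies on the org having active duplicate rules for Leads) uses the Datacloud classes to filter out incoming Leads that match existing records:

    // Sketch only: sample data and the "skip records with matches" policy are assumptions.
    List<Lead> incoming = new List<Lead>{
        new Lead(FirstName = 'Pat', LastName = 'Doe',
                 Company = 'Cloud Kicks', Email = 'pat@example.com')
    };
    List<Lead> leadsToInsert = new List<Lead>();
    List<Datacloud.FindDuplicatesResult> results =
        Datacloud.FindDuplicates.findDuplicates(incoming);
    for (Integer i = 0; i < results.size(); i++) {
        Boolean hasMatch = false;
        for (Datacloud.DuplicateResult dup : results[i].getDuplicateResults()) {
            for (Datacloud.MatchResult match : dup.getMatchResults()) {
                if (!match.getMatchRecords().isEmpty()) {
                    hasMatch = true;
                }
            }
        }
        if (!hasMatch) {
            leadsToInsert.add(incoming[i]);
        }
    }
    insert leadsToInsert;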