Accurate Hot Selling Data-Architect Exam Dumps 2023 Newly Released [Q40-Q60]

Get 100% Authentic Salesforce Data-Architect Dumps with Correct Answers

Salesforce Data-Architect Exam Syllabus Topics:

- Topic 1: Recommend appropriate approaches and techniques to capture and maintain customer reference & metadata; compare and contrast various techniques for improving performance when migrating large data volumes into Salesforce.
- Topic 2: Decide when to use virtualised data and describe virtualised data options; compare and contrast various techniques and considerations for exporting data from Salesforce.
- Topic 3: Describe techniques to represent a single view of the customer on the Salesforce platform; recommend appropriate techniques and methods for ensuring high data quality at load time.
- Topic 4: Recommend and use techniques for establishing a "golden record" or "system of truth"; recommend a data archiving and purging plan that is optimal for the customer's data storage management needs.
- Topic 5: Recommend a design to effectively consolidate and/or leverage data from multiple Salesforce instances; compare and contrast various techniques, approaches and considerations.

NO.40 Universal Containers wishes to maintain Lead data from Leads even after they are deleted and cleared from the Recycle Bin. What approach should be implemented to achieve this solution? (See the retrieval sketch after question 42.)
A. Use a Converted Lead report to display data on Leads that have been deleted.
B. Send data to a Data Warehouse and mark Leads as deleted in that system.
C. Use a Lead standard report and filter on the IsDeleted standard field.
D. Query Salesforce with the queryAll API method or using the ALL ROWS SOQL keywords.

NO.41 North Trail Outfitters (NTO) operates the majority of its business from a central Salesforce org. NTO also owns several secondary orgs that the service, finance, and marketing teams work out of. At the moment, there is no integration between the central and secondary orgs, leading to data-visibility issues. Moving forward, NTO has identified a hub-and-spoke model as the proper architecture to manage its data, where the central org is the hub and the secondary orgs are the spokes. Which tool should a data architect use to orchestrate data between the hub org and spoke orgs?
A. A backup and archive solution that extracts and restores data across orgs.
B. Develop custom APIs to poll the spoke orgs for change data and push it into the hub org.
C. Develop custom APIs to poll the hub org for change data and push it into the spoke orgs.
D. A middleware solution that extracts and distributes data across both the hub and spokes.

NO.42 Universal Containers (UC) has 1,000 accounts and 50,000 opportunities. UC has an enterprise security requirement to export all sales data outside of Salesforce on a weekly basis. The security requirement also calls for exporting key operational data that includes events such as file downloads, logins, logouts, etc. Which two recommended approaches would address the above requirement?
A. Use Event Monitoring to extract event data to on-premise systems.
B. Use a custom-built extract job to extract operational data to on-premise systems.
C. Use Weekly Export to extract transactional data to on-premise systems.
D. Use Field Audit History to capture operational data and extract it to on-premise systems.
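Question 40 hinges on how deleted records can still be retrieved. As a minimal sketch (the instance URL and access token below are placeholders), the REST API's queryAll resource returns soft-deleted rows that the ordinary query resource filters out; in Apex, the equivalent is appending ALL ROWS to the SOQL statement. Note that rows emptied from the Recycle Bin remain visible to queryAll only until Salesforce physically purges them, which is why off-platform retention also appears among the options.

```python
import requests

# Minimal sketch -- the instance URL and access token are placeholders,
# not real credentials.
INSTANCE = "https://example.my.salesforce.com"
ACCESS_TOKEN = "00D...PLACEHOLDER"

# The queryAll REST resource includes soft-deleted rows (IsDeleted = true),
# which the plain /query resource silently excludes.
soql = "SELECT Id, Name, IsDeleted FROM Lead WHERE IsDeleted = true"
resp = requests.get(
    f"{INSTANCE}/services/data/v58.0/queryAll/",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"q": soql},
)
resp.raise_for_status()
for record in resp.json()["records"]:
    print(record["Id"], record["Name"])
```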
NO.43 Universal Containers (UC) has implemented Salesforce. UC is running out of storage and needs an archiving solution. UC would like to maintain two years of data in Salesforce and archive older data out of Salesforce. Which solution should a data architect recommend as an archiving solution?
A. Build a batch job to move all records off platform, and delete all records from Salesforce.
B. Use a third-party backup solution to back up all data off platform.
C. Build a batch job to move all records off platform, and delete old records from Salesforce.
D. Build a batch job to move two-year-old records off platform, and delete those records from Salesforce.

NO.44 DreamHouse Realty has a data model as shown in the image. The Project object has a private sharing model, and it has Roll-Up Summary fields to calculate the number of resources assigned to the project, total hours for the project, and the number of work items associated with the project. A large number of time entry records will be loaded regularly from an external system into Salesforce. What should the Architect consider in this situation?
A. Calculate summary values instead of Roll-Up by using workflow.
B. Load all data after deferring sharing calculations.
C. Load all data using external IDs to link to parent records.
D. Calculate summary values instead of Roll-Up by using triggers.

NO.45 Universal Containers is planning its archiving and purging strategy for the custom objects Topic__c and Comment__c. Several options are being considered, including analytics snapshots, offsite storage, scheduled purges, etc. Which three questions should be considered when designing an appropriate archiving strategy?
A. How many fields are defined on the custom objects that need to be archived?
B. Are there any regulatory restrictions that will influence the archiving and purging plans?
C. Will the data being archived need to be reported on or accessed in any way in the future?
D. Which profiles and users currently have access to these custom object records?
E. If reporting is necessary, can the information be aggregated into fewer, summary records?

NO.46 An Architect needs to document the data architecture for a multi-system, enterprise Salesforce implementation. Which two key artifacts should the Architect use? (Choose two.)
A. User stories
B. Non-functional requirements
C. Integration specification
D. Data model

NO.47 NTO has multiple systems across its enterprise landscape, including Salesforce, with disparate versions of the customer records. In Salesforce, the customer is represented by the Contact object. NTO utilizes an MDM solution with these attributes:
1. The MDM solution keeps track of the customer master with a master key.
2. The master key is a map to the record IDs from each external system that customer data is stored within.
3. The MDM solution provides de-duplication features, so it acts as the single source of truth.
How should a data architect implement the storage of the master key within Salesforce?
A. Create a custom object to store the master key with a lookup field to Contact.
B. Store the master key on the Contact object as an external ID (field for referential imports).
C. Create an external object to store the master key with a lookup field to Contact.
D. Store the master key in Heroku Postgres and use Heroku Connect for synchronization.
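Question 47's external-ID pattern is worth seeing concretely. In this minimal sketch (the org URL and token are placeholders, and Master_Key__c is a hypothetical external ID field on Contact), an upsert keyed on the external ID lets the MDM hub create or update Contacts without ever storing Salesforce record IDs:

```python
import requests

INSTANCE = "https://example.my.salesforce.com"  # placeholder org
ACCESS_TOKEN = "00D...PLACEHOLDER"
master_key = "MDM-000042"  # made-up MDM master key value

# Upsert by external ID: a PATCH against the external-ID field resource
# creates or updates the Contact whose Master_Key__c matches the key.
resp = requests.patch(
    f"{INSTANCE}/services/data/v58.0/sobjects/Contact/Master_Key__c/{master_key}",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
             "Content-Type": "application/json"},
    json={"LastName": "Rivera", "Email": "rivera@example.com"},
)
resp.raise_for_status()  # 201 on create, 204 on update
```

A 201 response means a new Contact was created; 204 means an existing record matched on the external ID and was updated, which is exactly the referential behavior option B describes.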
NO.48 UC is having issues using Informatica Cloud Loader to export 10M+ Order records. Each Order record has 10 Order Line Items. What two steps can you take to help correct this? Choose two answers. (A PK-chunking sketch follows question 51.)
A. Export in multiple batches
B. Use PK Chunking
C. Limit batches to 10K records
D. Export with the Bulk API in parallel mode

NO.49 Universal Containers (UC) is a major supplier of office supplies. Some products are produced by UC and some by other manufacturers. Recently, a number of customers have complained that product descriptions on the invoices do not match the descriptions in the online catalog and on some of the order confirmations (e.g., "ballpoint pen" in the catalog and "pen" on the invoice, and item color labels are inconsistent: "wht" vs. "White" or "blk" vs. "Black"). All product data is consolidated in the company data warehouse and pushed to Salesforce to generate quotes and invoices. The online catalog and webshop are a Salesforce Customer Community solution. What is a correct technique UC should use to solve the data inconsistency?
A. Add custom fields to the Product standard object in Salesforce to store data from the different source systems.
B. Build Apex triggers in Salesforce that ensure products have the correct names and labels after data is loaded into Salesforce.
C. Define a data taxonomy for product data and apply the taxonomy to the product data in the data warehouse.
D. Change the integration to let product master systems update product data directly in Salesforce via the Salesforce API.

NO.50 Universal Containers is integrating a new Opportunity engagement system with Salesforce. According to their Master Data Management strategy, Salesforce is the system of record for Account, Contact, and Opportunity data. However, there does seem to be valuable Opportunity data in the new system that potentially conflicts with what is stored in Salesforce. What is the recommended course of action to appropriately integrate this new system?
A. Stakeholders should be brought together to discuss the appropriate data strategy moving forward.
B. The Opportunity engagement system should become the system of record for Opportunity records.
C. The MDM strategy defines Salesforce as the system of record, so Salesforce Opportunity values prevail in all conflicts.
D. A policy should be adopted so that the system whose record was most recently updated should prevail in conflicts.

NO.51 Two million Opportunities need to be loaded in different batches into Salesforce using the Bulk API in parallel mode. What should an Architect consider when loading the Opportunity records?
A. Use the Name field values to sort batches.
B. Group batches by the AccountId field.
C. Create indexes on Opportunity object text fields.
D. Order batches by Auto-number field.
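Questions 48 and 51 (and question 59 later in the set) all deal with large-volume Bulk API jobs. As a hedged sketch of PK chunking with Bulk API 1.0 (the instance URL and session ID are placeholders, and the Order__c object name is assumed from question 59), the Sforce-Enable-PKChunking header tells Salesforce to split the extract into primary-key ranges so that no single query has to scan the whole table:

```python
import requests

INSTANCE = "https://example.my.salesforce.com"  # placeholder org
SESSION_ID = "00D...PLACEHOLDER"

# Create a Bulk API 1.0 query job with PK chunking enabled. Salesforce
# splits the extract into ID-range chunks (here 100k records per chunk),
# each processed as its own batch.
resp = requests.post(
    f"{INSTANCE}/services/async/58.0/job",
    headers={
        "X-SFDC-Session": SESSION_ID,
        "Content-Type": "application/json",
        "Sforce-Enable-PKChunking": "chunkSize=100000",
    },
    json={"operation": "query", "object": "Order__c",
          "contentType": "JSON", "concurrencyMode": "Parallel"},
)
resp.raise_for_status()
print(resp.json()["id"])  # job ID; poll its batches for chunked results
```

For parallel-mode loads (question 51), the analogous concern is lock contention rather than chunking: batches that touch the same parent Account block one another, which is why grouping batches by AccountId is the pattern to remember.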
NO.52 Universal Containers (UC) maintains a collection of several million Account records that represent businesses in the United States. As a logistics company, this list is one of the most valuable and important components of UC's business, and the accuracy of shipping addresses is paramount. Recently it has been noticed that too many of the addresses of these businesses are inaccurate, or the businesses don't exist. Which two scalable strategies should UC consider to improve the quality of their Account addresses?
A. Contact each business on the list and ask them to review and update their address information.
B. Build a team of employees that validate Accounts by searching the web and making phone calls.
C. Leverage Data.com Clean to clean up Account address fields with the D&B database.
D. Integrate with a third-party database or service for address validation and enrichment.

NO.53 A shipping and logistics company has created a large number of reports within Sales Cloud since Salesforce was introduced. Some of these reports analyze large amounts of data regarding the whereabouts of the company's containers, and they are starting to time out when users try to run them. What is a recommended approach to avoid these time-out issues?
A. Improve reporting performance by replacing the existing reports in Sales Cloud with new reports based on Analytics Cloud.
B. Improve reporting performance by creating a custom Visualforce report that uses a cache of the records in the report.
C. Improve reporting performance by creating a dashboard that is scheduled to run the reports only once per day.
D. Improve reporting performance by creating an Apex trigger for the Report object that will pre-fetch data before the report is run.

NO.54 NTO has decided that it is going to build a channel sales portal with the following requirements:
1. External resellers are able to authenticate to the portal with a login.
2. Lead data, opportunity data, and order data are available to authenticated users.
3. Authenticated users may need to run reports and dashboards.
4. There is no need for more than 10 custom objects or additional file storage.
Which Community Cloud license type should a data architect recommend to meet the portal requirements?
A. Customer Community Plus
B. Lightning External Apps Starter
C. Customer Community
D. Partner Community

NO.55 Universal Containers (UC) has a complex system landscape and is implementing a data governance program for the first time. Which two first steps would be appropriate for UC to initiate an assessment of data architecture? Choose 2 answers.
A. Engage with executive sponsorship to assess enterprise data strategy and goals.
B. Engage with business units and IT to assess current operational systems and data models.
C. Engage with database administrators to assess current database performance metrics.
D. Engage with IT program managers to assess the current velocity of projects in the pipeline.

NO.56 Universal Containers (UC) has multi-level account hierarchies that represent departments within their major Accounts. Users are creating duplicate Contacts across multiple departments. UC wants to clean the data so as to have a single Contact across departments. Which two solutions should UC implement to cleanse their data? Choose 2 answers.
A. Make use of a third-party tool to help merge duplicate Contacts across Accounts.
B. Make use of the Merge Contacts feature of Salesforce to merge duplicates for an Account.
C. Use Data.com to standardize Contact address information to help identify duplicates.
D. Use Workflow rules to standardize Contact information to identify and prevent duplicates.
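Before merging duplicates as question 56 describes, it helps to measure the problem. A minimal sketch (placeholder credentials; matching on exact email is a deliberately naive rule for illustration) that uses an aggregate SOQL query to list candidate duplicate Contacts:

```python
import requests

INSTANCE = "https://example.my.salesforce.com"  # placeholder org
ACCESS_TOKEN = "00D...PLACEHOLDER"

# Aggregate SOQL that surfaces candidate duplicate Contacts by email,
# as a scoping pass before choosing merge tooling.
soql = ("SELECT Email, COUNT(Id) dupes FROM Contact "
        "WHERE Email != null GROUP BY Email HAVING COUNT(Id) > 1")
resp = requests.get(
    f"{INSTANCE}/services/data/v58.0/query/",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"q": soql},
)
resp.raise_for_status()
for row in resp.json()["records"]:
    print(row["Email"], row["dupes"])
```

Real matching logic would normalize names and addresses as well; the point is only that an aggregate query can size the cleanup before any merge feature or third-party tool is selected.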
NO.57 Universal Containers (UC) is launching an RFP to acquire a new accounting product available on AppExchange. UC is expecting to issue 5 million invoices per year, with each invoice containing an average of 10 line items. What should UC's Data Architect recommend to ensure scalability?
A. Ensure the accounting product vendor provides a sound data archiving strategy.
B. Ensure the accounting product vendor includes Wave Analytics in their offering.
C. Ensure invoice line items simply reference existing Opportunity line items.
D. Ensure the accounting product runs 100% natively on the Salesforce platform.

NO.58 In their legacy system, Universal Containers has a monthly accounts receivable report that compiles data from Accounts, Contacts, Opportunities, Orders, and Order Line Items. What difficulty will an architect run into when implementing this in Salesforce?
A. Custom report types cannot contain Opportunity data.
B. Salesforce allows up to four objects in a single report type.
C. A report cannot contain data from Accounts and Contacts.
D. Salesforce does not support Orders or Order Line Items.

NO.59 Universal Containers has more than 10 million records in the Order__c object, and a bulk query against it has timed out. What should be considered to resolve the query timeout? (A Bulk API 2.0 sketch follows question 60.)
A. Streaming API
B. Tooling API
C. PK Chunking
D. Metadata API

NO.60 UC is migrating individual customer (B2C) data from legacy systems to Salesforce. There are millions of customers stored as accounts and contacts in the legacy database. Which object model should a data architect configure within Salesforce?
A. Leverage custom Account and Contact objects in Salesforce.
B. Leverage a custom Person Account object in Salesforce.
C. Leverage the standard Account and Contact objects in Salesforce.
D. Leverage the Person Account object in Salesforce.
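Question 59's timeout is the textbook motivation for PK chunking, sketched earlier after question 51. A closely related alternative, again a hedged sketch with placeholder credentials, is a Bulk API 2.0 query job, where the platform chunks the work internally instead of requiring an explicit header:

```python
import requests

INSTANCE = "https://example.my.salesforce.com"  # placeholder org
ACCESS_TOKEN = "00D...PLACEHOLDER"

# Bulk API 2.0 splits query work internally, avoiding the timeout a
# single synchronous query over 10M+ rows would hit.
resp = requests.post(
    f"{INSTANCE}/services/data/v58.0/jobs/query",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
             "Content-Type": "application/json"},
    json={"operation": "query",
          "query": "SELECT Id, Name FROM Order__c"},
)
resp.raise_for_status()
job_id = resp.json()["id"]
print("Submitted job:", job_id)
# Poll /jobs/query/{job_id} until its state is JobComplete, then page
# through /jobs/query/{job_id}/results to download the CSV output.
```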