Best Salesforce Data-Cloud-Consultant Exam Practice Material Updated on Nov 13, 2024 [Q81-Q103]

New Data-Cloud-Consultant Actual Exam Dumps, Salesforce Practice Test

Q81. Which permission setting should a consultant check if the custom Salesforce CRM object is not available in the New Data Stream configuration?
A. Confirm the Create object permission is enabled in the Data Cloud org.
B. Confirm the View All object permission is enabled in the source Salesforce CRM org.
C. Confirm the Ingest Object permission is enabled in the Salesforce CRM org.
D. Confirm that the Modify Object permission is enabled in the Data Cloud org.
To create a new data stream from a custom Salesforce CRM object, the consultant needs to confirm that the View All object permission is enabled in the source Salesforce CRM org. This permission allows the user to view all records associated with the object, regardless of sharing settings. Without this permission, the custom object will not be available in the New Data Stream configuration.
References: Manage Access with Data Cloud Permission Sets; Object Permissions

Q82. Northern Trail Outfitters (NTO) wants to connect their B2C Commerce data with Data Cloud and bring two years of transactional history into Data Cloud.
What should NTO use to achieve this?
A. B2C Commerce Starter Bundles
B. Direct Sales Order entity ingestion
C. Direct Sales Product entity ingestion
D. B2C Commerce Starter Bundles plus a custom extract
The B2C Commerce Starter Bundles are predefined data streams that ingest order and product data from B2C Commerce into Data Cloud. However, the starter bundles only bring in the last 90 days of data by default. To bring in two years of transactional history, NTO needs to use a custom extract from B2C Commerce that includes the historical data and configure the data stream to use the custom extract as the source. The other options are not sufficient to achieve this because:
* A. B2C Commerce Starter Bundles only ingest the last 90 days of data by default.
* B. Direct Sales Order entity ingestion is not a supported method for connecting B2C Commerce data with Data Cloud. Data Cloud does not provide a direct-access connection for B2C Commerce data, only data ingestion.
* C. Direct Sales Product entity ingestion is not a supported method for connecting B2C Commerce data with Data Cloud. Data Cloud does not provide a direct-access connection for B2C Commerce data, only data ingestion.
References: Create a B2C Commerce Data Bundle – Salesforce; B2C Commerce Connector – Salesforce; Salesforce B2C Commerce Pricing Plans & Costs

Q83. A Data Cloud customer wants to adjust their identity resolution rules to increase the accuracy of matches. Rather than matching on email address, they want to review a rule that joins their CRM Contacts with their Marketing Contacts, where both use the CRM ID as their primary key.
Which two steps should the consultant take to address this new use case? Choose 2 answers.
A. Map the primary key from the two systems to Party Identification, using CRM ID as the identification name for both.
B. Map the primary key from the two systems to Party Identification, using CRM ID as the identification name for individuals coming from the CRM, and Marketing ID as the identification name for individuals coming from the marketing platform.
C. Create a custom matching rule for an exact match on the Individual ID attribute.
D. Create a matching rule based on party identification that matches on CRM ID as the party identification name.
To address this new use case, the consultant should map the primary key from the two systems to Party Identification, using CRM ID as the identification name for both, and create a matching rule based on party identification that matches on CRM ID as the party identification name. This way, the consultant can ensure that the CRM Contacts and Marketing Contacts are matched based on their CRM ID, which is a unique identifier for each individual. By using Party Identification, the consultant can also leverage the benefits of this attribute, such as being able to match across different entities and sources, and being able to handle multiple values for the same individual. The other options are incorrect because they either do not use the CRM ID as the primary key, or they do not use Party Identification as the attribute type.
References: Configure Identity Resolution Rulesets; Identity Resolution Match Rules; Data Cloud Identity Resolution Ruleset; Data Cloud Identity Resolution Config Input
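To illustrate the kind of exact match this rule performs, here is a minimal Python sketch that joins two illustrative contact lists on a shared CRM ID used as the party identification value. It is only a toy model of the matching logic; the record shapes, field names, and values are assumptions, not Data Cloud objects or APIs.

```python
# Illustrative only: a toy exact match on a shared CRM ID, mimicking a party
# identification match rule. Record shapes and values are assumptions.
crm_contacts = [
    {"source": "CRM", "crm_id": "003A1", "email": "rachel@example.com"},
    {"source": "CRM", "crm_id": "003B2", "email": "sam@example.com"},
]
marketing_contacts = [
    {"source": "Marketing", "crm_id": "003A1", "email": "rachel@example.com"},
    {"source": "Marketing", "crm_id": "003C3", "email": "lee@example.com"},
]

# Index marketing records by the party identification value (the CRM ID).
by_crm_id = {rec["crm_id"]: rec for rec in marketing_contacts}

# An exact match on the identification value links records from both sources.
matches = [
    (crm, by_crm_id[crm["crm_id"]])
    for crm in crm_contacts
    if crm["crm_id"] in by_crm_id
]
print(matches)  # only the 003A1 pair matches across the two sources
```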
Q84. Northern Trail Outfitters uses B2C Commerce and is exploring implementing Data Cloud to get a unified view of its customers and all their order transactions.
What should the consultant keep in mind with regard to historical data when ingesting order data using the B2C Commerce Order Bundle?
A. The B2C Commerce Order Bundle ingests 12 months of historical data.
B. The B2C Commerce Order Bundle ingests 6 months of historical data.
C. The B2C Commerce Order Bundle does not ingest any historical data and only ingests new orders from that point on.
D. The B2C Commerce Order Bundle ingests 30 days of historical data.
The B2C Commerce Order Bundle is a data bundle that creates a data stream to flow order data from a B2C Commerce instance to Data Cloud. However, this data bundle does not ingest any historical data and only ingests new orders from the time the data stream is created. Therefore, if a consultant wants to ingest historical order data, they need to use a different method, such as exporting the data from B2C Commerce and importing it into Data Cloud using a CSV file.
References: Create a B2C Commerce Data Bundle; Data Access and Export for B2C Commerce and Commerce Marketplace

Q85. Every day, Northern Trail Outfitters uploads a summary of the last 24 hours of store transactions to a new file in an Amazon S3 bucket, and files older than seven days are automatically deleted. Each file contains a timestamp in a standardized naming convention.
Which two options should a consultant configure when ingesting this data stream? Choose 2 answers.
A. Ensure that deletion of old files is enabled.
B. Ensure the refresh mode is set to “Upsert”.
C. Ensure the filename contains a wildcard to accommodate the timestamp.
D. Ensure the refresh mode is set to “Full Refresh.”
When ingesting data from an Amazon S3 bucket, the consultant should configure the following options:
* The refresh mode should be set to “Upsert”, which means that new and updated records will be added or updated in Data Cloud, while existing records will be preserved. This ensures that the data is always up to date and consistent with the source.
* The filename should contain a wildcard to accommodate the timestamp, which means that the file name pattern should include a variable part that matches the timestamp format. For example, if the file name is store_transactions_2023-12-18.csv, the wildcard could be store_transactions_*.csv. This ensures that the ingestion process can identify and process the correct file every day.
The other options are not necessary or relevant for this scenario:
* Deletion of old files is a feature of the Amazon S3 bucket, not the Data Cloud ingestion process. Data Cloud does not delete any files from the source, nor does it require the source files to be deleted after ingestion.
* Full Refresh is a refresh mode that deletes all existing records in Data Cloud and replaces them with the records from the source file. This is not suitable for this scenario, as it would result in data loss and inconsistency, especially if the source file only contains the summary of the last 24 hours of transactions.
References: Ingest Data from Amazon S3; Refresh Modes
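As a quick illustration of the wildcard pattern, the sketch below uses Python's fnmatch to show how a pattern such as store_transactions_*.csv picks up each day's timestamped file while ignoring other feeds. The file names are made up for the example; this is not Data Cloud configuration syntax.

```python
from fnmatch import fnmatch

# Illustrative only: how a wildcard file-name pattern matches daily,
# timestamp-suffixed uploads. File names here are invented for the example.
pattern = "store_transactions_*.csv"

uploaded_files = [
    "store_transactions_2023-12-16.csv",
    "store_transactions_2023-12-17.csv",
    "store_transactions_2023-12-18.csv",
    "inventory_snapshot_2023-12-18.csv",  # a different feed, should not match
]

daily_files = [name for name in uploaded_files if fnmatch(name, pattern)]
print(daily_files)
# ['store_transactions_2023-12-16.csv', 'store_transactions_2023-12-17.csv',
#  'store_transactions_2023-12-18.csv']
```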
Q86. Which data model subject area should be used for any Organization, Individual, or Member in the Customer 360 data model?
A. Engagement
B. Membership
C. Party
D. Global Account
The data model subject area that should be used for any Organization, Individual, or Member in the Customer 360 data model is the Party subject area. The Party subject area defines the entities that are involved in any business transaction or relationship, such as customers, prospects, partners, suppliers, etc. The Party subject area contains the following data model objects (DMOs):
* Organization: A DMO that represents a legal entity or a business unit, such as a company, a department, a branch, etc.
* Individual: A DMO that represents a person, such as a customer, a contact, a user, etc.
* Member: A DMO that represents the relationship between an individual and an organization, such as an employee, a customer, a partner, etc.
The other options are not data model subject areas that should be used for any Organization, Individual, or Member in the Customer 360 data model. The Engagement subject area defines the actions that people take, such as clicks, views, and purchases. The Membership subject area defines the associations that people have with groups, such as loyalty programs, clubs, and communities. The Global Account subject area defines the hierarchical relationships between organizations, such as parent-child and subsidiary relationships.
References: Data Model Subject Areas; Party Subject Area; Customer 360 Data Model

Q87. A customer needs to integrate in real time with Salesforce CRM.
Which feature accomplishes this requirement?
A. Streaming transforms
B. Data model triggers
C. Sales and Service bundle
D. Data actions and Lightning web components
The correct answer is A, Streaming transforms. Streaming transforms are a feature of Data Cloud that allows real-time data integration with Salesforce CRM. Streaming transforms use the Data Cloud Streaming API to synchronize micro-batches of updates between the CRM data source and Data Cloud in near-real time, so Data Cloud has the most current and accurate CRM data for segmentation and activation.
The other options are incorrect for the following reasons:
* B. Data model triggers are a feature of Data Cloud that allows custom logic to be executed when data model objects are created, updated, or deleted. Data model triggers do not integrate data with Salesforce CRM, but rather manipulate data within Data Cloud.
* C. The Sales and Service bundle is a feature of Data Cloud that provides pre-built data streams, data model objects, segments, and activations for Sales Cloud and Service Cloud data sources. The Sales and Service bundle does not integrate data in real time with Salesforce CRM, but rather ingests data at scheduled intervals.
* D. Data actions and Lightning web components are features of Data Cloud that allow custom user interfaces and workflows to be built and embedded in Salesforce applications. They do not integrate data with Salesforce CRM, but rather display and interact with data within Salesforce applications.
References: Load Data into Data Cloud; Data Streams in Data Cloud; Data Model Triggers in Data Cloud (Trailhead unit); Sales and Service Bundle in Data Cloud (Trailhead unit); Data Actions and Lightning Web Components in Data Cloud (Trailhead unit); Data Model in Data Cloud (Trailhead unit); Create a Data Model Object (Salesforce Help); Data Sources in Data Cloud (Trailhead unit); Connect and Ingest Data in Data Cloud (Salesforce Help); Data Spaces in Data Cloud (Trailhead unit); Create a Data Space (Salesforce Help); Segments in Data Cloud (Trailhead unit); Create a Segment (Salesforce Help); Activations in Data Cloud (Trailhead unit); Create an Activation (Salesforce Help)
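To make the upsert behavior behind near-real-time synchronization concrete, here is a purely illustrative Python sketch that applies micro-batches of change events to a local store keyed by record ID. It is not the Data Cloud Streaming API or any connector code; the record shape and values are assumptions.

```python
# Illustrative only: applying micro-batches of change events as upserts,
# keyed by record ID. This mimics "new or changed rows update the copy in
# near real time"; it is not actual Data Cloud or Salesforce CRM API code.
profiles = {}  # record ID -> latest known field values

def apply_micro_batch(batch):
    """Upsert each change event: update if the key exists, insert otherwise."""
    for event in batch:
        record_id = event["Id"]
        profiles.setdefault(record_id, {}).update(event)

# Two micro-batches arriving a few seconds apart.
apply_micro_batch([{"Id": "003A1", "Email": "rachel@example.com", "City": "Oslo"}])
apply_micro_batch([
    {"Id": "003A1", "City": "Bergen"},             # update to an existing record
    {"Id": "003B2", "Email": "sam@example.com"},   # brand-new record
])

print(profiles["003A1"])
# {'Id': '003A1', 'Email': 'rachel@example.com', 'City': 'Bergen'}
```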
Q88. What is the role of artificial intelligence (AI) in Data Cloud?
A. Automating data validation
B. Creating dynamic data-driven management dashboards
C. Enhancing customer interactions through insights and predictions
D. Generating email templates for use cases
Artificial intelligence plays a crucial role in Salesforce Data Cloud by leveraging data to generate insights and predictions that enhance customer interactions.
Insights and predictions:
* AI algorithms: Machine learning algorithms analyze vast amounts of customer data.
* Predictive analytics: AI provides predictive insights, such as customer behavior trends, preferences, and potential future actions.
Enhancing customer interactions:
* Personalization: AI helps create personalized experiences by predicting customer needs and preferences.
* Efficiency: AI enables proactive customer service by predicting issues and suggesting solutions before customers reach out.
* Marketing: AI improves targeting and segmentation, ensuring that marketing efforts are directed toward the most promising leads and customers.
Use cases:
* Recommendation engines: Suggest products or services based on past behavior and preferences.
* Churn prediction: Identify customers at risk of leaving and engage them with retention strategies.
References: Salesforce Data Cloud AI Capabilities; Salesforce AI for Customer Interaction

Q89. Which data model subject area defines the revenue or quantity for an opportunity by product family?
A. Engagement
B. Product
C. Party
D. Sales Order
The Sales Order subject area defines the details of an order placed by a customer for one or more products or services. It includes information such as the order date, status, amount, quantity, currency, payment method, and delivery method. The Sales Order subject area also allows you to track the revenue or quantity for an opportunity by product family, which is a grouping of products that share common characteristics or features. For example, you can use the Sales Order Line Item DMO to associate each product in an order with its product family, and then use the Sales Order Revenue DMO to calculate the total revenue or quantity for each product family in an opportunity.
References: Sales Order Subject Area; Sales Order Revenue DMO Reference
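The sketch below shows, in plain Python, the kind of roll-up described above: summing revenue and quantity per product family across an opportunity's line items. The field names and values are assumptions for illustration, not actual DMO field API names.

```python
from collections import defaultdict

# Illustrative only: rolling up order line items to revenue and quantity per
# product family. Field names are assumptions, not actual DMO API names.
line_items = [
    {"opportunity": "OPP-1", "product_family": "Tents",    "quantity": 2, "revenue": 600.0},
    {"opportunity": "OPP-1", "product_family": "Footwear", "quantity": 1, "revenue": 150.0},
    {"opportunity": "OPP-1", "product_family": "Tents",    "quantity": 1, "revenue": 300.0},
]

totals = defaultdict(lambda: {"quantity": 0, "revenue": 0.0})
for item in line_items:
    family = item["product_family"]
    totals[family]["quantity"] += item["quantity"]
    totals[family]["revenue"] += item["revenue"]

print(dict(totals))
# {'Tents': {'quantity': 3, 'revenue': 900.0},
#  'Footwear': {'quantity': 1, 'revenue': 150.0}}
```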
Q90. Northern Trail Outfitters (NTO) wants to send a promotional campaign to customers that have purchased within the past 6 months. The consultant created a segment to meet this requirement. Now, NTO brings an additional requirement to suppress customers who have made purchases within the last week.
What should the consultant use to remove the recent customers?
A. Batch transforms
B. Segmentation exclude rules
C. Related attributes
D. Streaming insight
The consultant should use B, segmentation exclude rules, to remove the recent customers. Segmentation exclude rules are filters that can be applied to a segment to exclude records that meet certain criteria. The consultant can use segmentation exclude rules to exclude customers who have made purchases within the last week from the segment that contains customers who have purchased within the past 6 months. This way, the segment will only include customers who are eligible for the promotional campaign.
The other options are not correct. Option A is incorrect because batch transforms are data processing tasks that can be applied to data streams or data lake objects to modify or enrich the data; they are not used for segmentation or activation. Option C is incorrect because related attributes are attributes derived from the relationships between data model objects; they are not used for excluding records from a segment. Option D is incorrect because streaming insights are derived attributes calculated at the time of data ingestion; they are not used for excluding records from a segment.
References: Salesforce Data Cloud Consultant Exam Guide; Segmentation; Segmentation Exclude Rules
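The filter logic behind this include-then-exclude pattern can be sketched in a few lines of Python. The dates and customer records are made up for the example; this is only a model of the segment logic, not Data Cloud segmentation syntax.

```python
from datetime import date, timedelta

# Illustrative only: include customers with a purchase in the last 6 months,
# then exclude anyone with a purchase in the last 7 days (the exclude rule).
today = date(2024, 11, 13)
six_months_ago = today - timedelta(days=182)
one_week_ago = today - timedelta(days=7)

last_purchase = {
    "cust-1": date(2024, 10, 1),   # in the 6-month window, not recent -> keep
    "cust-2": date(2024, 11, 10),  # purchased this week -> excluded
    "cust-3": date(2024, 3, 2),    # outside the 6-month window -> never included
}

included = {c for c, d in last_purchase.items() if d >= six_months_ago}
excluded = {c for c, d in last_purchase.items() if d >= one_week_ago}
audience = included - excluded
print(sorted(audience))  # ['cust-1']
```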
Q91. A consultant is ingesting a list of employees from their human resources database that they want to segment on.
Which data stream category should the consultant choose when ingesting this data?
A. Profile Data
B. Contact Data
C. Other Data
D. Engagement Data
The data stream categories are:
* Profile Data: customer profiles and demographic information.
* Contact Data: contact points such as email addresses and phone numbers.
* Other Data: miscellaneous data that doesn't fit into the other categories.
* Engagement Data: interactions and behavioral data.

Q92. A client wants to bring in loyalty data from a custom object in Salesforce CRM that contains a point balance for accrued hotel points and airline points within the same record. The client wants to split these point systems into two separate records for better tracking and processing.
What should a consultant recommend in this scenario?
A. Clone the data source object.
B. Use batch transforms to create a second data lake object.
C. Create a junction object in Salesforce CRM and modify the ingestion strategy.
D. Create a data kit from the data lake object and deploy it to the same Data Cloud org.
Batch transforms are a feature that allows creating new data lake objects based on existing data lake objects and applying transformations to them. This can be useful for splitting, merging, or reshaping data to fit the data model or business requirements. In this case, the consultant can use batch transforms to create a second data lake object that contains only the airline points from the original loyalty data object. The original object can be modified to contain only the hotel points. This way, the client can have two separate records for each point system and track and process them accordingly.
References: Batch Transforms; Create a Batch Transform
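Here is a minimal Python sketch of the reshaping a batch transform would perform in this scenario: one combined loyalty record is split into one record per point system. The field names and sample rows are assumptions for illustration, not the client's actual object definition.

```python
# Illustrative only: splitting a combined loyalty record into one record per
# point system, the way a batch transform would reshape a data lake object.
loyalty_rows = [
    {"member_id": "M-100", "hotel_points": 1200, "airline_points": 800},
    {"member_id": "M-200", "hotel_points": 0,    "airline_points": 450},
]

hotel_points = [
    {"member_id": r["member_id"], "program": "hotel", "points": r["hotel_points"]}
    for r in loyalty_rows
]
airline_points = [
    {"member_id": r["member_id"], "program": "airline", "points": r["airline_points"]}
    for r in loyalty_rows
]

print(hotel_points[0])    # {'member_id': 'M-100', 'program': 'hotel', 'points': 1200}
print(airline_points[0])  # {'member_id': 'M-100', 'program': 'airline', 'points': 800}
```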
Q93. How does identity resolution select attributes for unified individuals when there is conflicting information in the data model?
A. Creates additional contact points
B. Leverages reconciliation rules
C. Creates additional rulesets
D. Leverages match rules
Identity resolution is the process of creating unified profiles of individuals by matching and merging data from different sources. When there is conflicting information in the data model, such as different names, addresses, or phone numbers for the same person, identity resolution leverages reconciliation rules to select the most accurate and complete attributes for the unified profile. Reconciliation rules are configurable rules that define how to resolve conflicts based on criteria such as recency, frequency, source priority, or completeness. For example, a reconciliation rule can specify that the most recent name or the most frequent phone number should be selected for the unified profile. Reconciliation rules can be applied at the attribute level or the contact point level.
References: Identity Resolution; Reconciliation Rules; Salesforce Data Cloud Exam Questions
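The sketch below models a "most recent" reconciliation of a single conflicting attribute in Python: each candidate value carries its source and last-modified date, and the rule keeps the newest one. The records are invented for the example; this is not the Data Cloud reconciliation engine.

```python
from datetime import date

# Illustrative only: a recency-based reconciliation of a conflicting phone
# number. Candidate values and sources are made up for the example.
candidate_phones = [
    {"value": "+1 415 555 0101", "source": "Service Cloud",   "modified": date(2024, 5, 2)},
    {"value": "+1 415 555 0199", "source": "Marketing Cloud", "modified": date(2024, 9, 30)},
]

def reconcile_most_recent(candidates):
    """Pick the candidate with the latest last-modified date (recency rule)."""
    return max(candidates, key=lambda c: c["modified"])

print(reconcile_most_recent(candidate_phones)["value"])  # +1 415 555 0199
```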
Q94. Where is value suggestion for attributes in segmentation enabled when creating the DMO?
A. Data Mapping
B. Data Transformation
C. Segment Setup
D. Data Stream Setup
Value suggestion for attributes in segmentation is a feature that allows you to see and select the possible values for a text field when creating segment filters. You can enable or disable this feature for each data model object (DMO) field in the DMO record home. Value suggestion can be enabled for up to 500 attributes for your entire org, and it can take up to 24 hours for suggested values to appear. To use value suggestion when creating segment filters, drag the attribute onto the canvas and start typing in the Value field for the attribute; you can also select multiple values for some operators. Value suggestion is not available for attributes with more than 255 characters or for relationships that are one-to-many (1:N).
References: Use Value Suggestions in Segmentation; Considerations for Selecting Related Attributes

Q95. Northern Trail Outfitters (NTO) is configuring an identity resolution ruleset based on Fuzzy Name and Normalized Email.
What should NTO do to ensure the best email address is activated?
A. Include the Contact Point Email object Is Active field as a match rule.
B. Use the source priority order in activations to make sure a contact point from the desired source is delivered to the activation target.
C. Ensure Marketing Cloud is prioritized as the first data source in the Source Priority reconciliation rule.
D. Set the default reconciliation rule to Last Updated.
NTO is using Fuzzy Name and Normalized Email as match rules to link together data from different sources into a unified individual profile. However, there might be cases where the same email address is available from more than one source, and NTO needs to decide which one to use for activation. For example, if Rachel has the same email address in Service Cloud and Marketing Cloud, but prefers to receive communications from NTO via Marketing Cloud, NTO needs to ensure that the email address from Marketing Cloud is activated. To do this, NTO can use the source priority order in activations, which allows them to rank the data sources in order of preference for activation. By placing Marketing Cloud higher than Service Cloud in the source priority order, NTO can make sure that the email address from Marketing Cloud is delivered to the activation target, such as an email campaign or a journey. This way, NTO can respect Rachel's preference and deliver a better customer experience.
References: Configure Activations; Use Source Priority Order in Activations

Q96. A customer wants to create segments of users based on their Customer Lifetime Value. However, the source data that will be brought into Data Cloud does not include that key performance indicator (KPI).
Which sequence of steps should the consultant follow to achieve this requirement?
A. Ingest Data > Map Data to Data Model > Create Calculated Insight > Use in Segmentation
B. Create Calculated Insight > Map Data to Data Model > Ingest Data > Use in Segmentation
C. Create Calculated Insight > Ingest Data > Map Data to Data Model > Use in Segmentation
D. Ingest Data > Create Calculated Insight > Map Data to Data Model > Use in Segmentation
To create segments of users based on their Customer Lifetime Value (CLV), the consultant should follow the sequence Ingest Data > Map Data to Data Model > Create Calculated Insight > Use in Segmentation. The first step is to ingest the source data into Data Cloud using data streams. The second step is to map the source data to the data model, which defines the structure and attributes of the data. The third step is to create a calculated insight, which is a derived attribute computed from the source or unified data; in this case, the calculated insight is the CLV, which can be calculated with a formula or query based on the sales order data. The fourth step is to use the calculated insight in segmentation, the process of creating groups of individuals or entities based on their attributes and behaviors. By using the CLV calculated insight, the consultant can segment users by the predicted revenue from the lifespan of their relationship with the brand. The other options do not follow the correct sequence: options B and C are incorrect because a calculated insight cannot be created before the data is ingested and mapped, as it depends on the data model objects, and option D is incorrect because creating a calculated insight before mapping the data means the insight may not reflect the correct data model structure and attributes.
References: Data Streams Overview; Data Model Objects Overview; Calculated Insights Overview; Calculating Customer Lifetime Value (CLV) With Salesforce; Segmentation Overview
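To show the arithmetic a CLV-style calculated insight performs, here is a small Python sketch that totals order revenue per customer and then segments on the derived KPI. The order rows, field names, and threshold are assumptions for illustration; this is not Calculated Insight query syntax.

```python
from collections import defaultdict

# Illustrative only: deriving a simple Customer Lifetime Value style KPI
# (total order revenue per customer) from ingested order rows, then
# segmenting on it. All data and thresholds are invented for the example.
orders = [
    {"customer_id": "cust-1", "order_total": 120.00},
    {"customer_id": "cust-1", "order_total": 80.50},
    {"customer_id": "cust-2", "order_total": 310.00},
]

clv = defaultdict(float)
for order in orders:
    clv[order["customer_id"]] += order["order_total"]

# Segment on the derived KPI, e.g. "high value" customers above a threshold.
high_value = [cust for cust, total in clv.items() if total >= 250]
print(dict(clv))    # {'cust-1': 200.5, 'cust-2': 310.0}
print(high_value)   # ['cust-2']
```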
Q97. A consultant is reviewing a recent activation using engagement-based related attributes but is not seeing any related attributes in their payload for the majority of their segment members.
Which two areas should the consultant review to help troubleshoot this issue? Choose 2 answers.
A. The related engagement events occurred within the last 90 days.
B. The activations are referencing segments that segment on profile data rather than engagement data.
C. The correct path is selected for the related attributes.
D. The activated profiles have a Unified Contact Point.
Engagement-based related attributes are attributes that describe the interactions of a person with an email message, such as opens, clicks, and unsubscribes. These attributes are stored in the Engagement data model object (DMO) and can be added to an activation to send more personalized communications. However, there are some considerations and limitations when using engagement-based related attributes:
* For engagement data, activation supports a 90-day lookback window. Only attributes from engagement events that occurred within the last 90 days are considered for activation; any records outside this window are not included in the activation payload. The consultant should therefore review the event time of the related engagement events and make sure they fall within the lookback window.
* The correct path to the related attributes must be selected for the activation. A path is a sequence of DMOs that are connected by relationships in the data model; for example, the path from Individual to Engagement is Individual -> Email -> Engagement. The path determines which related attributes are available for activation and how they are filtered. The consultant should review the path selection and make sure it matches the desired related attributes and filters.
The other two options are not relevant to this issue. The activations can reference segments that segment on profile data rather than engagement data, as long as the activation target supports related attributes. The activated profiles do not need to have a Unified Contact Point, which is a unique identifier for a person across different data sources, to activate engagement-based related attributes.
References: Add Related Attributes to an Activation; Related Attributes in Data Cloud activation have no values; Explore the Engagement Data Model Object
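The 90-day lookback window can be pictured with a short Python filter: events older than the window simply never make it into the payload. The event records and dates are made up for the example; this is only a model of the window, not activation code.

```python
from datetime import date, timedelta

# Illustrative only: filtering engagement events to a 90-day activation
# lookback window. Events outside the window contribute no related
# attributes to the activation payload. Data is invented for the example.
today = date(2024, 11, 13)
lookback_start = today - timedelta(days=90)

engagement_events = [
    {"individual_id": "ind-1", "event": "EmailOpen",  "event_time": date(2024, 10, 20)},
    {"individual_id": "ind-2", "event": "EmailClick", "event_time": date(2024, 6, 1)},
]

in_window = [e for e in engagement_events if e["event_time"] >= lookback_start]
print([e["individual_id"] for e in in_window])  # ['ind-1'] -- ind-2 falls outside the window
```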
Q98. Which two dependencies need to be removed prior to disconnecting a data source? Choose 2 answers.
A. Activation target
B. Segment
C. Activation
D. Data stream
Before disconnecting a data source, all dependencies on it must be removed to prevent data integrity issues.

Q99. During a privacy law discussion with a customer, the customer indicates they need to honor requests for the right to be forgotten. The consultant determines that the Consent API will solve this business need.
Which two considerations should the consultant inform the customer about? Choose 2 answers.
A. Data deletion requests are reprocessed at 30, 60, and 90 days.
B. Data deletion requests are processed within 1 hour.
C. Data deletion requests are submitted for Individual profiles.
D. Data deletion requests submitted to Data Cloud are passed to all connected Salesforce clouds.
When advising a customer about using the Consent API in Salesforce to comply with requests for the right to be forgotten, the consultant should focus on two primary considerations:
* Data deletion requests are submitted for Individual profiles (answer C): The Consent API in Salesforce is designed to handle data deletion requests specifically for individual profiles. When a request is made to delete data, it is targeted at the personal data associated with an individual's profile in the Salesforce system. The consultant should inform the customer that requests must be specific to individual profiles to ensure accurate processing and compliance with privacy laws.
* Data deletion requests submitted to Data Cloud are passed to all connected Salesforce clouds (answer D): When a data deletion request is made through the Consent API in Salesforce Data Cloud, the request is not limited to Data Cloud alone. Instead, it propagates through all connected Salesforce clouds, such as Sales Cloud, Service Cloud, and Marketing Cloud. This ensures comprehensive compliance with the right to be forgotten across the entire Salesforce ecosystem. The customer should be aware that the deletion request will affect all instances of the individual's data across the connected Salesforce environments.

Q100. When performing segmentation or activation, which time zone is used to publish and refresh data?
A. Time zone specified on the activity at the time of creation
B. Time zone of the user creating the activity
C. Time zone of the Data Cloud Admin user
D. Time zone set by the Salesforce Data Cloud org
The time zone used to publish and refresh data when performing segmentation or activation is the time zone set by the Salesforce Data Cloud org. This time zone is configured in the org settings when Data Cloud is provisioned, and it applies to all users and activities in Data Cloud. It determines when segments are scheduled to refresh and when activations are scheduled to publish. Therefore, it is important to consider the time zone difference between the Data Cloud org and the destination systems or channels when planning segmentation and activation strategies.
References: Salesforce Data Cloud Consultant Exam Guide; Segmentation; Activation

Q101. Northern Trail Outfitters (NTO) is configuring an identity resolution ruleset based on Fuzzy Name and Normalized Email.
What should NTO do to ensure the best email address is activated?
A. Include the Contact Point Email object Is Active field as a match rule.
B. Use the source priority order in activations to make sure a contact point from the desired source is delivered to the activation target.
C. Ensure Marketing Cloud is prioritized as the first data source in the Source Priority reconciliation rule.
D. Set the default reconciliation rule to Last Updated.
NTO is using Fuzzy Name and Normalized Email as match rules to link together data from different sources into a unified individual profile. However, there might be cases where the same email address is available from more than one source, and NTO needs to decide which one to use for activation. For example, if Rachel has the same email address in Service Cloud and Marketing Cloud, but prefers to receive communications from NTO via Marketing Cloud, NTO needs to ensure that the email address from Marketing Cloud is activated. To do this, NTO can use the source priority order in activations, which allows them to rank the data sources in order of preference for activation. By placing Marketing Cloud higher than Service Cloud in the source priority order, NTO can make sure that the email address from Marketing Cloud is delivered to the activation target, such as an email campaign or a journey. This way, NTO can respect Rachel's preference and deliver a better customer experience.
References: Configure Activations; Use Source Priority Order in Activations

Q102. Luxury Retailers created a segment targeting high value customers that it activates through Marketing Cloud for email communication. The company notices that the activated count is smaller than the segment count.
What is a reason for this?
A. Marketing Cloud activations apply a frequency cap and limit the number of records that can be sent in an activation.
B. Data Cloud enforces the presence of a Contact Point for Marketing Cloud activations. If the individual does not have a related Contact Point, it will not be activated.
C. Marketing Cloud activations automatically suppress individuals who are unengaged and have not opened or clicked on an email in the last six months.
D. Marketing Cloud activations only activate those individuals that already exist in Marketing Cloud. They do not allow activation of new records.
Data Cloud requires a Contact Point for Marketing Cloud activations, which is a record that links an individual to an email address. This ensures that the individual has given consent to receive email communications and that the email address is valid. If the individual does not have a related Contact Point, they will not be activated in Marketing Cloud. This may result in a lower activated count than the segment count.
References: Data Cloud Activation; Contact Point for Marketing Cloud

Q103. A consultant needs to package Data Cloud components from one organization to another.
Which two Data Cloud components should the consultant include in a data kit to achieve this goal? Choose 2 answers.
A. Data model objects
B. Segments
C. Calculated insights
D. Identity resolution rulesets
To package Data Cloud components from one organization to another, the consultant should include the following components in a data kit:
* Data model objects: the custom objects that define the data model for Data Cloud, such as Individual, Segment, and Activity. They store the data ingested from various sources and enable the creation of unified profiles and segments.
* Identity resolution rulesets: the rules that determine how data from different sources are matched and merged to create unified profiles. They specify the criteria, logic, and priority for identity resolution.
References: Data Model Objects in Data Cloud; Identity Resolution Rulesets in Data Cloud

Study HIGH Quality Data-Cloud-Consultant Free Study Guides and Exams Tutorials: https://www.vceprep.com/Data-Cloud-Consultant-latest-vce-prep.html