Siebel - Issue With Activity Templates

I am fairly new to Siebel. I have run into the following issue:
A template is created in the Activity Plan List Applet, and certain activities are added in the Activity Plan Action Applet.
However, I don't find those activities available in the LN Account Activity List Applet. (This applet is available in the Account Summary tab of the Accounts screen.)
Business Components and associated Applets:
1. Activity Plan - Activity Plan List Applet
2. Activity Plan Action - Activity Plan Action Applet
3. LN Action - LN Account Activity List Applet
I checked the links between the business components:
Links:
Account/LN Action - Source Field(Id), Dest Field(Primary Coverage Company Id)
Account/Activity Plan - Source Field(Id), Dest Field(Account Id)
I need to auto-populate the field Primary Coverage Company Id.
How do I proceed with the auto-population?

Updating a field from a different DAC

I created a customization on the service order as shown below. When Mark for SO is active, a CREATE SALES ORDER button appears. The button enables the creation of a service management sales order.
The service management sales order is as follows:
How do I get the STATUS and ORDERNBR of the created service management sales order to be displayed on the service order screen, on the details line, in my custom fields, i.e. SOStatus (UsrSOStatus)? I also have another custom field, SONumber (UsrSONumber).

How to deduplicate GCP logs from Logs Explorer?

I am using GCP Logs Explorer to store logging messages from my pipeline.
I need to debug an issue by looking at logs from a specific event. The error message is identical across events except for an event ID at the end.
So for example, the error message is
event ID does not exist: foo
I know that I can use the following syntax to construct a query that will return the logs with this particular message structure:
resource.type="some_resource"
resource.labels.project_id="some_project"
resource.labels.job_id="some_id"
severity=WARNING
jsonPayload.message:"Event ID does not exist:"
The last line in that query will then return every log where the message contains that string.
I end up with a result like this:
Event ID does not exist: 1A
Event ID does not exist: 2A
Event ID does not exist: 2A
Event ID does not exist: 3A
So I wish to deduplicate that to end up with only:
Event ID does not exist: 1A
Event ID does not exist: 2A
Event ID does not exist: 3A
But I don't see support for this type of deduplication in the language docs.
Due to the number of rows, I also cannot download a delimited log file.
Is it possible to deduplicate the rows?
To deduplicate records with BigQuery, follow these steps:
1) Identify whether your dataset contains duplicates.
2) Create a SELECT query that aggregates the desired column using a GROUP BY clause.
3) Materialize the result to a new table using CREATE OR REPLACE TABLE [tablename] AS [SELECT STATEMENT].
You can review the full tutorial in this link.
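For the specific question above, once the logs are routed to BigQuery (see below), the deduplication is a single aggregation. Here is a minimal sketch using the Google.Cloud.BigQuery.V2 client library; the project, dataset, and table names are placeholders, not values from the question:

using System;
using Google.Cloud.BigQuery.V2;

class DeduplicateLogs
{
    static void Main()
    {
        // Placeholder project ID; substitute your own.
        BigQueryClient client = BigQueryClient.Create("some_project");

        // SELECT DISTINCT collapses duplicate rows such as the repeated
        // "Event ID does not exist: 2A" entries from the question.
        string sql = @"
            SELECT DISTINCT jsonPayload.message AS message
            FROM `some_project.logs_dataset.logs_table`
            WHERE jsonPayload.message LIKE 'Event ID does not exist:%'";

        foreach (BigQueryRow row in client.ExecuteQuery(sql, parameters: null))
        {
            Console.WriteLine(row["message"]);
        }
    }
}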
To analyze a large quantity of logs, you could route them to BigQuery with Fluentd and analyze them there.
Fluentd has an output plugin that can use BigQuery as a destination for storing the collected logs. Using the plugin, you can directly load logs into BigQuery in near real time from many servers.
In this link, you can find a complete tutorial on how to analyze logs using Fluentd and BigQuery.
To route your logs to BigQuery, you first need to create a sink that has BigQuery as its destination.
Sinks control how Cloud Logging routes logs. Using sinks, you can route some or all of your logs to supported destinations. Sinks belong to a given Google Cloud resource: Cloud projects, billing accounts, folders, and organizations. When the resource receives a log entry, it routes the log entry according to the sinks contained by that resource. The log entry is sent to the destination associated with each matching sink.
You can route log entries from Cloud Logging to BigQuery using sinks. When you create a sink, you define a BigQuery dataset as the destination. Logging sends log entries that match the sink's rules to partitioned tables that are created for you in that BigQuery dataset.
1) In the Cloud console, go to the Logs Router page:
2) Select an existing Cloud project.
3) Select Create sink.
4) In the Sink details panel, enter the following details:
Sink name: Provide an identifier for the sink; note that after you create the sink, you can't rename it, but you can delete it and create a new sink.
Sink description (optional): Describe the purpose or use case for the sink.
5) In the Sink destination panel, select the sink service and destination:
Select sink service: Select the service where you want your logs routed. Based on the service that you select, you can select from the following destinations:
BigQuery table: Select or create the particular dataset to receive the routed logs. You also have the option to use partitioned tables.
For example, if your sink destination is a BigQuery dataset, the sink destination would be the following:
bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID
Note that if you are routing logs between Cloud projects, you still need the appropriate destination permissions.
6) In the Choose logs to include in sink panel, do the following:
In the Build inclusion filter field, enter a filter expression that matches the log entries you want to include. If you don't set a filter, all logs from your selected resource are routed to the destination.
To verify you entered the correct filter, select Preview logs. This opens the Logs Explorer in a new tab with the filter prepopulated.
7) (Optional) In the Choose logs to filter out of sink panel, do the following:
In the Exclusion filter name field, enter a name.
In the Build an exclusion filter field, enter a filter expression that matches the log entries you want to exclude. You can also use the sample function to select a portion of the log entries to exclude. You can create up to 50 exclusion filters per sink. Note that the length of a filter can't exceed 20,000 characters.
8) Select Create sink.
More information about configuring and managing sinks is available here.
To review the details, formatting, and rules that apply when routing log entries from Cloud Logging to BigQuery, please follow this link.
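If you prefer to create the sink programmatically rather than in the Cloud console, the following is a minimal sketch using the Google.Cloud.Logging.V2 client library. The sink name, project, dataset, and filter are placeholder values chosen to match the question, not something the steps above prescribe:

using System;
using Google.Cloud.Logging.V2;

class CreateBigQuerySink
{
    static void Main()
    {
        ConfigServiceV2Client client = ConfigServiceV2Client.Create();

        // Placeholder values; substitute your own project, dataset, and filter.
        var sink = new LogSink
        {
            Name = "warning-events-to-bq",
            Destination = "bigquery.googleapis.com/projects/some_project/datasets/logs_dataset",
            Filter = "severity=WARNING AND jsonPayload.message:\"Event ID does not exist:\""
        };

        LogSink created = client.CreateSink("projects/some_project", sink);
        Console.WriteLine("Created sink: " + created.Name);
    }
}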

Sitecore publish item from web to master

I am working with Sitecore Intranet Portal. I am using the web database for the CD instance.
If a user changes his email ID, how would I publish this change to the master database?
I am using this code to publish an item from the web to the master database.
// The publishOptions determine the source and target database,
// the publish mode and language, and the publish date.
var publishOptions = new PublishOptions(
    Database.GetDatabase("web"),
    Database.GetDatabase("master"),
    PublishMode.SingleItem,
    item.Language,
    DateTime.Now);
var publisher = new Publisher(publishOptions);
// Choose where to publish from.
publisher.Options.RootItem = item;
// Publish children as well?
publisher.Options.Deep = true;
// Do the publish!
publisher.Publish();
It would be nice to know the correct procedure.
Publishing from web to master is bad practice.
This kind of content is called User Generated Content. I suggest using this approach:
https://sitecore.unic.com/2015/07/16/user-generated-content-in-a-security-hardened-environment
Or you can use web services. This allows all content (even user-generated) to be authored in your CM instance, which lets you leverage the capabilities of the platform for workflow, publishing, etc.
Either use an external database, or use Sitecore users for a certain domain and store everything in the core database. The core database is commonly shared between all environments.
You can make use of a package to transfer your items from web to master. Follow these steps:
Create your item package from web.
Use the tool that I have developed to convert the destination of the items, that is, to change it from web to master. The tool is found on the marketplace as the Sitecore Package Modifier.
Install the modified package on the master database.
Another solution is to have a scheduled job that creates the package from web, modifies it, and installs it on the master database; this requires no manual intervention.
I prefer this method to the Sitecore transfer method because the transfer method tends to time out or throw an error if there are lots of items to transfer.
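For the scheduled-job variant, the install step can be scripted against the Sitecore install framework. This is a minimal sketch, assuming the code runs inside the Sitecore context of the CM instance and that packagePath points to the already-modified package; it is illustration only, not part of the Sitecore Package Modifier tool:

using Sitecore.Install;
using Sitecore.Install.Files;
using Sitecore.Install.Framework;
using Sitecore.Install.Items;
using Sitecore.Install.Utils;

public static class PackageInstaller
{
    public static void Install(string packagePath)
    {
        // Overwrite existing items so the latest content from web wins.
        var context = new SimpleProcessingContext();
        context.AddAspect(new DefaultItemInstallerEvents(
            new BehaviourOptions(InstallMode.Overwrite, MergeMode.Undefined)));
        context.AddAspect(new DefaultFileInstallerEvents(true));

        new Installer().InstallPackage(packagePath, context);
    }
}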
The best practice is to store user information in the Core database. Sitecore uses ASP.NET membership to manage user accounts, so I recommend you store all the user info in the core DB.
Sitecore doesn't provide an option to publish data from web to master; however, it does provide an option to transfer data from the web DB to the master DB.
Check out my blog, Transfer Items from Web to Master Database, for complete details on transferring data from one DB to another.
Step 1: Log in to the Sitecore desktop.
Step 2: Select the source database from which you need to transfer the items. If you want to transfer from web to master, select the web database in the bottom right-hand corner.
Step 3: Open the Content Editor and select the item you would like to transfer.
Step 4: Right-click the item, select Copying >> Transfer, and click the Transfer button.
Step 5: In the Transfer window, verify that the source item is selected properly.
Step 6: Click Next to continue. In this window, the first thing is to select the destination database to which you need to transfer; the second is to select the parent item or destination folder where you need to place this item. In my case, I need to select Layout because sublayouts are present in the Layouts item.
Step 7: Click Next; here you will get an option that says Include Subitems. Select the checkbox if you need to transfer the subitems as well; otherwise, uncheck the box to transfer only the selected item.
Step 8: Click the Transfer button and it will start the transfer process. Close the dialog box once the transfer is completed.

Set Sharepoint task due date based on workflow status

I'm new to SharePoint, and most of it seems pretty straightforward, but I've hit a problem and haven't been able to find a way around it so far.
I'm trying to set and enforce Service Level Agreements (SLAs) for different departments based on the department the task is assigned to. I was going to do this based on the workflow status that generates the task, but I am open to any other suggestions.
My workflow for requesting funds for an approved project goes through several stages (Project Management validation, Finance Admin validation, Finance Manager validation, Fixed Assets authorization), and each one has a slightly different SLA. For this reason, I can't just add an arbitrary value to the start date for the calculated column associated with the task.
Any suggestions?
The option I'd go with here is to use If/Then blocks in my workflow code based on the current stage. Something like this:
If Stage = Project Management Validation Then
Set DueDate to Today+5
Else If Stage = Finance Admin Validation Then
Set DueDate to Today+3
etc.
Hope this helps!
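If the workflow logic lives in code rather than in a declarative workflow, the same branching can sit in an event receiver on the task list. This is a minimal sketch against the SharePoint server object model; the field names ("Stage", "DueDate") and the SLA offsets per stage are assumptions for illustration, so adjust them to your list:

using System;
using Microsoft.SharePoint;

// Minimal sketch: apply a per-stage SLA due date when a task is created.
public class SlaTaskReceiver : SPItemEventReceiver
{
    public override void ItemAdded(SPItemEventProperties properties)
    {
        SPListItem task = properties.ListItem;
        string stage = task["Stage"] as string ?? string.Empty;

        // Assumed SLA offsets per validation stage (calendar days, not working days).
        int slaDays;
        if (stage == "Project Management Validation")
            slaDays = 5;
        else if (stage == "Finance Admin Validation")
            slaDays = 3;
        else if (stage == "Finance Manager Validation")
            slaDays = 3;
        else
            slaDays = 2; // e.g. Fixed Assets Authorization

        task["DueDate"] = DateTime.Today.AddDays(slaDays);
        task.Update();
    }
}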

Sitecore Publishing Problems and determining item state

Can anyone explain to me what state the data should be in, for a healthy Sitecore instance, in each database?
For example:
We currently have an issue with publishing in a 2 server setup.
Our staging server hosts the SQL instance and the authoring / staging instance of sitecore.
We then have a second server to host just the production website for our corp site.
When I look in the master database the PublishQueue table is full of entries and the same table in the web database is empty.
Is this correct?
No amount of hitting publish buttons is changing that at the moment.
How do I determine the state of an item in both the staging and production environments, without having to write an application on top of the Sitecore API, which I really don't have time for?
It is normal behavior for the Publish Queue of the web database to be blank. The reason is that changes are made on the master database, which adds an entry to the Publish Queue.
After publishing, the item is not removed from the Publish Queue table; it is the job of the CleanupPublishQueue task to clean up the publish queue table.
In general, tables WILL be different between the two databases as they are used for different purposes. Your master database is generally connected to by authors and the publishing logic, while the web database is generally used as a holding place for the latest published version of content that should be visible.
In terms of debugging publishing, from the Sitecore desktop, you can swap between 'master' and 'web' databases in the lower right corner and use the Content Editor to examine any individual item. This is useful for spot checking individual items have been published successfully.
If an item is missing from 'web', or the wrong version is in 'web', you should examine the following:
Publishing Restrictions on the item: Is there a restriction applied to the item or version that prevents it from publishing at this time?
Workflow state: Is the item/version in the final approved workflow state? You can use the workbox to do a quick check for items needing approval.
Connection strings: Are your staging system's connection strings set up to connect to the correct 'web' database used by the production delivery server?
The database table [PublishQueue] is where all saves and other mutations are recorded. This table is used by an incremental publish: Sitecore gets all the items from the PublishQueue table that were modified more recently than the last incremental publish date. The PublishQueue table is not used by a full publish.
So it is okay that this table contains a lot of records on the master. The web database has the same database schema (but not the same data; web contains only one version of an item, optimized for performance). It is normal for the PublishQueue table on web to be empty.
To know the state of an item, compare the master version with the web version. There can be more than one web database, and the master database does not know the state/version of the web databases.
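To spot-check an item without writing a full application, a few lines against the Sitecore API are enough. This is a minimal sketch, assuming it runs inside a Sitecore instance that has connection strings for both databases; the item path passed in is a placeholder:

using Sitecore.Data;
using Sitecore.Data.Items;
using Sitecore.Diagnostics;

public static class PublishStateChecker
{
    public static void Compare(string itemPath)
    {
        Database master = Database.GetDatabase("master");
        Database web = Database.GetDatabase("web");

        Item masterItem = master.GetItem(itemPath);
        Item webItem = web.GetItem(itemPath);

        if (masterItem == null)
        {
            Log.Info("Item does not exist in master: " + itemPath, typeof(PublishStateChecker));
            return;
        }
        if (webItem == null)
        {
            Log.Info("Item has never been published: " + itemPath, typeof(PublishStateChecker));
            return;
        }

        // web holds only a single version per language, so comparing version
        // numbers shows whether the latest master version has been published.
        Log.Info(
            string.Format("{0}: master v{1}, web v{2}", itemPath,
                masterItem.Versions.Count, webItem.Version.Number),
            typeof(PublishStateChecker));
    }
}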