Redmine - how to add an activity in the DDL

Wow. I can't believe how raw the Redmine setup is.
Anyway... I wanted to update a ticket in a project, and there is an activity drop-down list. I can't save unless I specify an activity, but there are none. How do I populate the list of activities from the UI?
Thanks

Time tracking activities: the content of the drop-down is defined in the global settings, under Administration > Enumerations > Activities (time tracking) - see the relevant Redmine guide.
By the way, if you don't enter anything for spent time and the related comment in the Log time section, you don't have to select an activity.

Related

Older Power-BI pbix using a SharePoint-Online list that has had new fields added

I have a Power-BI dashboard that was created some time ago. Its data source is a SharePoint-Online list. Since the Power-BI dashboard was created, several new fields have been added to the SharePoint list. Now I am being asked to add a new page to the dashboard that reports on those new fields. However, I have not found a way to get the existing Power-BI list/dataset to show the new fields.
Refreshing the data does refresh the values, but refresh does not add the new fields.
I've spent the last 4 hours looking on the internet for a solution. The only thing I have been able to do so far is to attach the list again with a different name - the new fields DO show up when I do this. (I can't just replace the older Power-BI list/dataset because several calculated columns and measures have been added to it.)
I can work with this and create the report, but is this the only way? It doesn’t seem like it should be.
Any help would be appreciated! Thank you!
(I'm using Power BI April 2021 and Sharepoint Online)
So, it looks like there's no good answer to this issue. I found that adding another instance of the referenced SharePoint list, which included the new columns, did work (however inelegant). That seems to be the best direct answer for times when the older pbix file must continue to be used.
What I ended up doing, though, was to create a new separate pbix file which included the latest version of the Sharepoint List. This was the best solution for my organization since it will allow us to be more focused on the specific manufacturing processes involved.
Thanks to #Jon and #Alejandro for their efforts to help!
If you have access to Power Automate, you could refresh the dataset by creating a flow so that at a given time (say, once or twice a day) the dataset gets refreshed with the newly created items.
Otherwise, if you are working with the service version of Power BI, you can schedule a refresh of the dataset directly from the workspace by going to the dataset's settings. You would need a gateway set up for that, which can be in personal mode or not.
Also, if you want to update the data in the service version, you can do it manually in the workspace.

Modifying PowerBIIntegration.Data fields from PowerBI without restarting PowerApp canvas from scratch

I am pulling multiple fields into PowerApps from PowerBI via the PowerBIIntegration.Data.(Value) connection.
I realised after designing the whole app that I was calling a 'Count' summary rather than an 'Average' summary of the metric I wanted. Having modified this, I can no longer retrieve that data - the connection seems to be broken.
Previously, on a separate occasion, I added a field in PowerBI after finishing the app, and again I could not retrieve the new field in PowerApps, only the initial fields I had added. It seems as if a snapshot of the fields is taken when you first make the app, and this can't be modified.
I don't want to rebuild the app a third time if I don't have to!
Is there a way to refresh this?
Yep, I have tried PowerBIIntegration.Refresh().
Thanks!
Solved this issue.
Publish the dashboard to the PowerBI service with the new fields.
Click the 3 dots in the top corner of the PowerApp and click edit.
This will take you to the browser/development space for the app.
New/updated/renamed fields are now included/reflected.

iCloud/CloudKit Dashboard Issues with Telemetry, Logs, Usage Display, Subscriptions, Notifications

Bad things are happening on the iCloud/CloudKit Dashboard for my app's container right now. Below is a summary of the issues I am having:
Issue 1: Log in to the CloudKit Dashboard. Select the iCloud.com.mycompany.myapp container. Select Production > Telemetry. Error popup: "Unable to Load Telemetry Data. Please file a Radar and include the current URL"
Issue 2: Select Logs from the top drop-down. Select Historical. Tap Search Logs. Error: "Error Loading Logs. The Logs could not be queried. Please try again later."
Issue 3: Select Usage from the top drop-down. There is absolutely no usage shown for Monthly or Daily (which is inaccurate).
Issue 4: Select Data from the top drop-down. Select Subscriptions. Hit the Fetch Subscriptions button. "There are no subscriptions in this database" is displayed (which is inaccurate).
Issue 5: Change notifications are not being sent for subscriptions that, when queried for within the app, DO exist.
Are these problems just something that I am experiencing, or are other people having similar issues with iCloud/Cloudkit? Is this related to the new UI? Is there anything I can try to do to debug or fix these issues or is this something that is simply out of my control?
This sounds to me like something unique to your account, or potentially a temporary degradation of service on Apple's part.
I would contact Developer Technical Support and see if they can help.
There is an all-new design for the dashboard. Try it again and see whether your issue was caused by the transition to this new design.

How to get a list of all page-level APEX_ITEMS in the current page?

I have an Apex application that is quite large. The need has come up to store detailed usage logs of this application. The information on APEX_WORKSPACE_ACTIVITY_LOG is not enough, because I need to know what queries each user runs on each page.
My first thought was to get the actual Oracle query logs (V$SQL and such), but they provide no information on the user (as far as the database is concerned, all queries are made by APEX_PUBLIC_USER). I have some information about the user on V$ACTIVE_SESSION_HISTORY, but that's incomplete because it stores samples of active sessions and their SQL queries at 1-second intervals, so I miss too many queries.
So now I'm off to implementing application-level logging. The "right" way to do this would be to go through all the pages in my application and create a logging process to store the relevant information for each one (username and some page items). But I wonder if there might be something simpler that does the trick.
If I understand correctly, "application processes" are run by every page in the application. So if I can get an application process to iterate over the list of page items, I can store them all in the database and be done with it. Something like
for item in page_items {
log(username, item_name, item, date)
}
Can this be done? Or maybe the information I need is on the database already and I don't see it?
You can query the metadata tables to get all items for a specific page and then use that to get their value.
select item_name, v(item_name) item_value
from apex_application_page_items
where application_id = :APP_ID
and page_id = :APP_PAGE_ID;
That will capture all items on the page. Don't forget that if you use items on Page 0 (Global Page) you may want to query that page too.
Additionally, you may want to capture application level items too.
select item_name, v(item_name) item_value
from apex_application_items
where application_id = :APP_ID;
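To tie this back to the pseudocode in the question: here is a minimal sketch of how both queries could be combined into a single process that runs on every page (for example, an application process fired after page submission) and writes each item and its value to a log table. The usage_log table and its columns are hypothetical - substitute whatever structure you actually want to store.
-- Sketch of an "after submit" application process.
-- Assumes a hypothetical log table such as:
--   create table usage_log (
--     app_user   varchar2(255),
--     item_name  varchar2(255),
--     item_value varchar2(4000),
--     logged_at  date
--   );
begin
  for r in (
    -- page items for the current page plus the Global Page (0) ...
    select item_name
      from apex_application_page_items
     where application_id = :APP_ID
       and page_id in (:APP_PAGE_ID, 0)
    union all
    -- ... and application-level items
    select item_name
      from apex_application_items
     where application_id = :APP_ID
  ) loop
    insert into usage_log (app_user, item_name, item_value, logged_at)
    -- truncate to fit the hypothetical varchar2(4000) column
    values (:APP_USER, r.item_name, substr(v(r.item_name), 1, 4000), sysdate);
  end loop;
end;
Defined once at application level, a process like this avoids having to add a separate logging process to every page.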

Set Sharepoint task due date based on workflow status

I'm new to SharePoint, and most of it seems pretty straightforward, but I've hit a problem and haven't been able to find a way around it so far.
I'm trying to set/enforce Service Level Agreements (SLAs) for different departments based on the department the task is assigned to. I was going to do this based on the workflow status that generates the task, but am open to any other suggestions.
My workflow for requesting funds for an approved project goes through several stages (Project management validation, Finance Admin validation, Finance manager validation, Fixed assets authorization), and each one has a slightly different SLA. For this reason, I can't just add an arbitrary value to the start date for the calculated column associated with the task.
Any suggestions?
The option I'd go with here is to use If/Then blocks in my workflow code based on the current stage. Something like this:
If Stage = Project Management Validation Then
    Set DueDate to Today + 5
Else If Stage = Finance Admin Validation Then
    Set DueDate to Today + 3
... and so on for the remaining stages.
Hope this helps!