Redmine Custom Query to return only specific columns

How to filter Redmine issues to only include specified columns, when using the REST API xml/json format?
I have a custom query to obtain specific columns of issues:
Filter set to "status(open)" and "Due Date(any)"
Options / Selected Columns set to "Due Date" (so that only the ID and the Due Date are returned in the response)
These are two ways of accessing it:
A /issues?query_id=myqueryid
B /issues.xml?query_id=myqueryid
When I run A in my browser, I get the correct response, containing only the id and the due date. With B, in contrast, every field (i.e. id, description, start date, ...) is included in the response.
I also tried to add a "fields" value, as some other APIs suggest, but to no avail (e.g. /issues.xml?issue_id=1337&fields=due_date, etc.).

Redmine's REST API (i.e. the JSON and XML APIs) always returns all base fields of the filtered issues. You can optionally include additional data such as watchers, journals, issue relations, ... by using the include mechanism described in the API docs.
When using the API, the client is expected to pick whichever fields it needs out of the response on its own. A feature to return a tailored set of columns, as the HTML table with selected columns in the web UI does, is currently considered not very useful for the API use case.
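For example, a client can request the saved query's result set and then keep only the columns it cares about itself. A minimal sketch in Python with the requests package (the Redmine URL, API key and query id below are placeholders):

import requests

REDMINE_URL = "https://redmine.example.com"  # placeholder for your Redmine instance
API_KEY = "your-api-key"                     # placeholder API key
QUERY_ID = 42                                # placeholder for "myqueryid"

issues = []
offset, page_size = 0, 100
while True:
    resp = requests.get(
        f"{REDMINE_URL}/issues.json",
        params={"query_id": QUERY_ID, "offset": offset, "limit": page_size},
        headers={"X-Redmine-API-Key": API_KEY},
    )
    resp.raise_for_status()
    data = resp.json()
    # The API returns all base fields; keep only id and due_date client-side.
    issues += [{"id": i["id"], "due_date": i.get("due_date")} for i in data["issues"]]
    offset += page_size
    if offset >= data["total_count"]:
        break

print(issues)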

Related

I can't read specific tables of Dynamics BC with Power BI

I am having trouble synchronising certain tables of our ERP (Dynamics Business Central) with Power BI. The steps I am taking are explained below:
Get Data
Search for Dynamics 365 Business Central
Search for the relevant tables
This is where Power BI doesn't let me preview the information within the table called 'salesCreditMemoLines', unlike other tables, which I can view without trouble.
I appreciate your help in this issue.
This is an expected error. Document line collections in the Business Central API require the respective document ID to be present in the request, otherwise the request fails.
This is the piece of code from the API page that throws this error.
IdFilter := GetFilter(SystemId);
DocumentIdFilter := GetFilter("Document Id");
if (IdFilter = '') and (DocumentIdFilter = '') then
    Error(IDOrDocumentIdShouldBeSpecifiedForLinesErr);
There are two ways to send the document ID. My examples below are querying sales orders, but the same applies to all document collections.
The first is to request the lines along with the document header, using the $expand syntax:
https://api.businesscentral.dynamics.com/v2.0/{{tenantid}}/{{environmentname}}/api/v2.0/companies({companyId})/salesOrders({orderId})?$expand=salesOrderLines
The other option is to query the document lines collection directly, adding a $filter parameter:
https://api.businesscentral.dynamics.com/v2.0/{{tenantid}}/{{environmentname}}/api/v2.0/companies({companyId})/salesOrderLines?$filter=documentId eq {salesOrderId}
Filters can include ranges or multiple values, so this way it's possible to request a collection of lines from multiple documents (see the sketch after the link below).
https://learn.microsoft.com/en-us/dynamics365/business-central/dev-itpro/webservices/use-filter-expressions-in-odata-uris
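For instance, a rough sketch in Python (requests package) that pulls the lines of two sales orders in one call by combining documentId conditions in the filter; the tenant, environment, company id, order ids and access token are all placeholders:

import requests

BASE = ("https://api.businesscentral.dynamics.com/v2.0/"
        "{tenantid}/{environmentname}/api/v2.0")            # placeholders
COMPANY_ID = "00000000-0000-0000-0000-000000000000"         # placeholder company id
ORDER_IDS = ["11111111-1111-1111-1111-111111111111",        # placeholder order ids
             "22222222-2222-2222-2222-222222222222"]

# Lines from several documents in one request: OR the documentId conditions together.
line_filter = " or ".join(f"documentId eq {oid}" for oid in ORDER_IDS)

resp = requests.get(
    f"{BASE}/companies({COMPANY_ID})/salesOrderLines",
    params={"$filter": line_filter},
    headers={"Authorization": "Bearer <access token>"},     # placeholder token
)
resp.raise_for_status()
print(resp.json()["value"])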
Neither of these methods is going to work in Power BI data source selection, though. An alternative way is to use an entity salesDocumentLines under the Web Services (legacy) option. Yes, it shows as legacy, but so far Microsoft has not announced any plans to remove the OData web services.
https://learn.microsoft.com/en-us/dynamics365-release-plan/2021wave1/smb/dynamics365-business-central/enable-power-bi-connector-work-business-central-apis-instead-web-services-only

Google Analytics - Regex for excluding sessions where Custom Dimension is populated in Custom Report

I'm building a custom report in GA UA to check which pages aren't passing specific custom dimensions. The Dimensions of the report are:
Hostname
Page
The metrics are:
Pageviews
Sessions
Users
What I'm expecting is to be able to add a filter with the following syntax to only get pages where the custom dimension isn't populating:
Exclude [Custom_dimension] Regex .*
When I do this, however, the report excludes all sessions. If I change from exclude to include, though, I only get the subset of sessions where the dimension is populating, roughly 30% of sessions.
Is there a way to get the 70% of sessions not covered by [Custom_dimension] by changing the regex?
GA explicitly doesn't show rows in custom reports for instances where any of the dimensions is not set. While it's definitely an odd decision on GA's part, you can still do a few things.
You can view, say, a default page report, drop a secondary hostname dimension there and then see where it's not set, or even try to filter those rows with an advanced filter's regex.
Or you can export some of the data to BQ and inspect it there with no UI limitations (see the sketch below).
Or you can export the data to an Excel or Google Sheets file and analyse it there; there are free options for that.
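For the BigQuery route, a sketch of the kind of query that lists pages whose hits carry no value for a given custom dimension. It assumes the GA360 BigQuery export schema and the google-cloud-bigquery package, and that the dimension in question is index 5; the project, dataset, date suffix and index are placeholders:

from google.cloud import bigquery  # assumes an authenticated google-cloud-bigquery setup

client = bigquery.Client()

query = """
SELECT
  h.page.hostname,
  h.page.pagePath,
  COUNT(*) AS pageviews
FROM
  `my-project.my_dataset.ga_sessions_20230101` AS s,  -- placeholder table
  UNNEST(s.hits) AS h
WHERE
  h.type = 'PAGE'
  -- keep only hits where custom dimension 5 was never set
  AND NOT EXISTS (SELECT 1 FROM UNNEST(h.customDimensions) cd WHERE cd.index = 5)
GROUP BY 1, 2
ORDER BY pageviews DESC
"""

for row in client.query(query).result():
    print(row.hostname, row.pagePath, row.pageviews)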

Update only specified fields of PrestaShop product via API

I would like to know if there is a way to update a resource (a product in this case), but only update the fields that are specified in my request.
For example, let's say I want to update only the required fields of the product but not change the others; how could I proceed via the webservice?
As far as I know, when you update a resource, it is updated as a whole: if you don't "re-send" the original value when updating (i.e. if you do not specify ALL the fields), the field is considered empty.
From my tests, I tried the following :
Log in to my PrestaShop, get a random product and set its "width" to a random value (let's say 30 cm)
Retrieve the blank schema of a product, and fill only the required fields of the schema (width is not part of it)
Send the schema via the webservice to update the original product using a PUT request
Get back in my PrestaShop and notice in despair and sadness that the width value has been set to 0
My use case is that we have a system that syncs products with PrestaShop. When a product is edited in our system, a specific set of fields is sent to PrestaShop so the product in the shop is also updated.
But some of our users want to be able to add information in the shop and keep it even if the product is updated afterwards. For example, they add dimensions to the product (fields that are not managed / persisted in our system) and they want to keep that information.
The constraint we have is that the set of fields sent for updates is "hardcoded": we can't get the resource schema, update it and send it back afterwards.
Is there any parameter / configuration that can be set so that the values of fields that are not specified are not erased?
Due to the nature of how the PrestaShop API works, you can perform the following:
For each product you want to update, you can pull the complete object (with all fields), update that object with only the fields you need changed, and then push the whole object back as an update for that product via the API.
Doing this for multiple products requires twice the API calls, but it is one workaround (sketched below).
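A sketch of that read-modify-write cycle in Python with the requests package and the standard library XML parser. The shop URL, webservice key, product id and the field being changed are placeholders, and the exact list of read-only nodes that must be stripped before the PUT may vary by PrestaShop version:

import requests
import xml.etree.ElementTree as ET

API_URL = "https://example-shop.com/api"  # placeholder shop URL
API_KEY = "YOUR_WEBSERVICE_KEY"           # placeholder webservice key
PRODUCT_ID = 1337                         # placeholder product id

# Keep the xlink prefix when re-serialising the resource.
ET.register_namespace("xlink", "http://www.w3.org/1999/xlink")

# 1. Pull the complete product resource (all fields).
resp = requests.get(f"{API_URL}/products/{PRODUCT_ID}", auth=(API_KEY, ""))
resp.raise_for_status()
root = ET.fromstring(resp.content)
product = root.find("product")

# 2. Change only the fields we actually want to update; everything else
#    keeps the value we just fetched (e.g. width stays at 30).
product.find("price").text = "19.99"

# Gotcha: some read-only nodes are rejected on PUT and must be removed first
# (the exact list depends on the PrestaShop version).
for readonly in ("manufacturer_name", "quantity"):
    node = product.find(readonly)
    if node is not None:
        product.remove(node)

# 3. Push the whole object back as the update.
resp = requests.put(
    f"{API_URL}/products/{PRODUCT_ID}",
    data=ET.tostring(root),
    auth=(API_KEY, ""),
    headers={"Content-Type": "application/xml"},
)
resp.raise_for_status()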
One way is to use a flag variable for the properties you want to update.

RESTful Web Services from Entity Classes

I want to implement RESTful web services to query my database.
In NetBeans I did this:
- I created entity classes from my DB
- I generated web services from these entity classes
GET methods work fine when testing, but I have some additional requirements. I don't want to query only by a table's ID; data also needs to be retrieved when some other parameters are entered.
For example I have a table:
Customer: id, name, address, country
Now I want to display all customers from a specific country.
Where in code can I achieve that?
You can do this with slightly different URLs.
So for a single customer you'd present a URL such as:
GET /customer/123.html
But for multiple customers, you'd figure out a way to specify groups. If you wanted all customers, you'd go for:
GET /customers.html
But say you grouped by country, you could try:
GET /customers/Australia.html
Using the singular or plural form would separate the two types of GET request.

How to use CopyIntoItems to copy files into existing doclib items

This is my scenario: I need to copy files to a SharePoint document library using its web services and set metadata on them. That's all possible with CopyIntoItems (from the Copy web service), except for Lookup fields. CopyIntoItems ignores them, so I need another way to set data in those fields.
I've tried to create a list item with the mandatory and lookup field metadata and then, using the item ID (creating a FieldInformation field with the ID, as well as some other simple metadata), called the CopyIntoItems method; instead of updating the item, SharePoint created a new one.
I can't do this in the reverse order because I have no way to get the ID of the item created by CopyIntoItems...
So, the question is: how can I upload a file to a SharePoint document library and set all of its metadata, including Lookup fields?
Use a regular PUT WebRequest to upload the document into the library
Query the document library to find the ID of the item you just uploaded (based on path)
Use the Lists.asmx web service to update the document metadata
Helpful link: Uploading files to the SharePoint Document Library and updating any metadata columns
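A rough sketch of those three steps in Python, assuming the requests and requests_ntlm packages and an NTLM-authenticated classic SharePoint site; the site URL, library name, lookup field name, lookup ID and the item ID lookup in step 2 are placeholders:

import requests
from requests_ntlm import HttpNtlmAuth  # assumption: on-premises site with NTLM auth

auth = HttpNtlmAuth("DOMAIN\\user", "password")           # placeholder credentials
site = "http://sharepoint/sites/mysite"                   # placeholder site URL
file_url = f"{site}/Shared Documents/report.pdf"          # destination path in the library

# 1. Upload the file with a plain HTTP PUT.
with open("report.pdf", "rb") as f:
    requests.put(file_url, data=f, auth=auth).raise_for_status()

# 2. Query the library for the ID of the item just uploaded, e.g. with a
#    GetListItems CAML query on FileRef (omitted here for brevity).
item_id = 42  # placeholder for the ID returned by that query

# 3. Update the metadata, including the lookup field, via Lists.asmx.
#    Lookup values are passed as "<lookup id>;#<value>" (the value part may be empty).
soap = f"""<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <UpdateListItems xmlns="http://schemas.microsoft.com/sharepoint/soap/">
      <listName>Shared Documents</listName>
      <updates>
        <Batch OnError="Continue">
          <Method ID="1" Cmd="Update">
            <Field Name="ID">{item_id}</Field>
            <Field Name="MyLookupField">7;#</Field>
          </Method>
        </Batch>
      </updates>
    </UpdateListItems>
  </soap:Body>
</soap:Envelope>"""

resp = requests.post(
    f"{site}/_vti_bin/Lists.asmx",
    data=soap.encode("utf-8"),
    headers={
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": "http://schemas.microsoft.com/sharepoint/soap/UpdateListItems",
    },
    auth=auth,
)
resp.raise_for_status()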
Keep in mind that if the destination folder item count plus the ancestor folders' item count exceeds the list view threshold, then you can't query the list for the ID (step 2 from Kit's answer).
Queries can be done more efficiently if constrained to a particular branch in the folder hierarchy. A workaround would be to modify the site settings, but the queries would be sluggish, and it would make the solution less portable because the threshold for Office 365 and BPOS can't be changed.
This explains it much better: http://office.microsoft.com/en-us/office365-sharepoint-online-enterprise-help/create-or-delete-a-folder-in-a-list-or-library-HA102771961.aspx