I have simple datasets from multiple tools; the problem is to predict the remaining useful life (RUL) of the tools. The Excel sheet I received contains data from multiple tools. I'm new to WEKA, and when I try to use support vector regression I cannot run the classification.
I am using Snowflake as my backend database and have created & published a dataset in Power BI with DirectQuery. As a next step I am trying to analyze the data in Excel (to get a PivotTable experience).
I am observing that the hierarchies I have created are not showing up in Excel, though they do show when accessed through the PBI Service.
DirectQuery comes with a slew of limitations compared to imported datasets. The only hierarchy-specific limitation in the official documentation is that auto date/time hierarchies are not created for DQ datasets. However, that documentation covers DirectQuery limitations in general and doesn't specifically address limitations that may apply only to XMLA connections, which is what your connection from Excel uses.
A workaround is to use calculated columns holding the hierarchy level values, name them something like Category01, Category02, Category03, and do the nesting yourself (a minimal sketch is below). Users often have use cases that involve using hierarchies out of order (like grouping by Category03 THEN by Category01), so consider it a feature rather than a flaw.
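For illustration, a minimal DAX sketch of that workaround; the 'Product' table and its source column names are assumptions, not from the original post:

    -- Flatten an assumed three-level hierarchy into plain calculated columns;
    -- table and column names here are hypothetical.
    Category01 = 'Product'[Division]
    Category02 = 'Product'[Department]
    Category03 = 'Product'[Class]

Excel then sees three ordinary columns over the XMLA connection, and users can nest them in a PivotTable in whatever order they need.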
What procs are easy to learn and essential for SAS programming? I have learned several, like proc print, sort, freq, format, univariate, anova, glm, import, and transpose. Which ones should I learn next?
Welcome to Stack Overflow (and SAS). The procedures that AlanC mentions are all important.
Probably your best bet is to pick up a copy of The Little SAS Book and learn the data processing as well as the analysis procedures. I have used many versions of it over the years, and students like it. SAS changes at a glacial pace, so if money is tight, pick up an older edition.
You have already hit many of the main procedures. Focus on data processing with the DATA step and PROC SQL (a short sketch of both is below). SQL is its own language and is extremely useful with or without SAS. Also do not neglect ODS: SAS can make very beautiful output, and the aesthetics matter when you are showing your portfolio.
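As a minimal sketch of the DATA step / PROC SQL pairing; the dataset and variable names (work.sales, amount, region) are assumptions for illustration:

    /* DATA step: subset rows with a WHERE condition */
    data work.high_value;
        set work.sales;
        where amount > 1000;
    run;

    /* PROC SQL: the same kind of work, plus grouping, in SQL */
    proc sql;
        create table work.region_totals as
        select region, sum(amount) as total_amount
        from work.sales
        group by region;
    quit;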
If you want to be a professional SAS programmer you will need to learn the macro language to automate tasks, and also the intermediate-to-advanced magic that Ron Cody writes about. Get comfortable with the base language, then work on converting your code into macros (see the sketch below). Along the way, be sure to check Cody's data cleaning book.
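A minimal sketch of that conversion; the macro name and parameters are hypothetical:

    /* Wrap a repeated report in a parameterized macro */
    %macro freq_report(ds=, var=);
        proc freq data=&ds;
            tables &var;
        run;
    %mend freq_report;

    /* Call it once per dataset/variable combination */
    %freq_report(ds=work.sales, var=region)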
We have a number of different business units, each managing their own separate (but consistent) data sets in separate Excel spreadsheets. I've created a multi-page pbix file that has queries looking at one of those spreadsheets, and the users are happy with how it all looks.
What I'd like to be able to do, now the design is accepted, is to duplicate the existing pages and change the data source (on just the duplicate pages, not all of them) to the other spreadsheets, without having to rebuild all the graphs and reapply all the formatting from scratch.
Is this possible? And if not, what would be the best approach: save as a new pbix, change the queries, then merge everything into a dashboard?
I'm relatively new to Power BI so still wrapping my head around how best to structure things.
Thanks in advance!
After a bit of experimentation the simplest route I found was:
Added an extra column in the source data for Business Unit
Created queries for each Excel file
Created an append query pulling all the queries together (see the M sketch below)
Built out the charts etc using the append query
Duplicated the page so there was one for each business unit
Then went back through each page and applied a Page Level Filter on the Business Unit column to show only the required business unit
It definitely pays off to plan your structure in advance (if you can) as it saves a lot of rework!
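A minimal Power Query (M) sketch of the append step; the per-file query names (SalesUnitA, SalesUnitB) are assumptions for illustration:

    // Append the per-business-unit queries into one table.
    // The Business Unit column already exists in each source here;
    // it could also be added per query with Table.AddColumn.
    let
        Source = Table.Combine({SalesUnitA, SalesUnitB})
    in
        Source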
OK, so I have a relatively complex report that works well in the desktop app but is bombing out on the web portal. Apparently, it is requesting 1048584KB, which is just over the 1048576KB limit.
This report is a matrix, built as follows:
It is connected to two primary data sources, along with some secondary feeds and helper tables. One of these is a sales detail table, a CSV 887MB in size. The other is a purchasing detail table, an XLS 26MB in size.
I have filtered out portions of the sales table (by date) in the Edit Queries screen (the kind of filter sketched below). I have also filtered out specific item divisions in the matrix. It was the second step that previously allowed this visual to function (I took out a few unneeded divisions and it started working again), but now this no longer seems to work.
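For reference, a minimal Power Query (M) sketch of that kind of date filter; the query name (SalesDetail), column name (OrderDate), and cutoff date are assumptions:

    // Keep only sales rows on or after an illustrative cutoff date.
    let
        Source = SalesDetail,
        Filtered = Table.SelectRows(Source, each [OrderDate] >= #date(2022, 1, 1))
    in
        Filtered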
I would like not just a quick answer here, but also to better understand how Power BI allocates memory and how I can streamline. The rest of the report uses the same data, but this is the only visual that fails to load (aside from some tables that are at line level and are intended to be filtered down via slicers before displaying information). I will add that there are some relatively complex measures firing on this visual that are not used anywhere else; I presume this has a lot to do with the memory demands, right?
Hi, my dataset contains only quantitative (numerical) data; it doesn't have any class attribute. The dataset contains sales from different years. I need to analyze the data in different ways. Can I use WEKA for this analysis? I tried the WEKA tool, but it seems I cannot proceed with WEKA unless I have a class variable for the dataset. Please kindly give me a hint.