I'm working on a research project and have been assigned to do some data scraping: writing R code that can extract the current temperature for a particular zip code from a site such as wunderground.com. This may be a bit of an abstract question, but does anyone know how to do the following?
I can extract the current temperature of a particular zip code by doing this:
temps <- readLines("http://www.wunderground.com/q/zmw:20904.1.99999")
edit(temps)
temps   # shows the page source, where I can find the line that contains the temperature
ldata <- temps[lnumber]   # lnumber is the line number found above
ldata
# then a few gsub() calls extract just the numerical data (57.8, for example) from that line
I have a csv file that contains the zip code of every city in the country, imported into R as a table arranged by zip, city and state. My challenge is to write a method (to use a Java analogy, since I'm new to R) that takes 6-7 consecutive zip codes (after a particular one specified), substitutes each into the readLines() URL after the zmw: segment, and runs the code above for each resulting link.
I don't quite know how to extract the data from the table. Maybe with a for loop? But then I don't know how to use the extracted values to modify the link, which is where I'm really stuck. I have a bit of a Java background, so I understand how to approach the problem; I just don't know the R syntax. I realize this is a fairly abstract question since I haven't provided much code, but I'd like to know the functions/syntax that would let me pull the zip codes from the table and build the link inside a function rather than doing it manually.
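To illustrate, a rough sketch of the loop being described might look like this (untested; it assumes the imported table has a column named zip and reuses the lnumber placeholder from the snippet above):
zips <- read.csv("zipcodes.csv", stringsAsFactors = FALSE)   # columns: zip, city, state

start  <- which(zips$zip == 20904)[1]          # row of the zip code specified
window <- zips$zip[(start + 1):(start + 7)]    # the next 7 consecutive zip codes after it

get_temp <- function(zip) {
  # build the link by substituting the zip code after the zmw: segment
  # (%05d restores any leading zero dropped when the csv was read as numbers)
  url   <- sprintf("http://www.wunderground.com/q/zmw:%05d.1.99999", zip)
  temps <- readLines(url)
  ldata <- temps[lnumber]                      # lnumber: the line that contains the temperature
  # ... gsub() calls here to strip everything but the number, as in the snippet above
  ldata
}

temperatures <- sapply(window, get_temp)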
So this is about the Weather Underground data.
You can download CSV files from individual weather stations on Weather Underground, but you need to know the weather station identifier. Here is an example URL for a weather station in Kirkland, WA (KWAKIRKL8):
http://www.wunderground.com/weatherstation/WXDailyHistory.asp?ID=KWAKIRKL8&day=31&month=1&year=2014&graphspan=day&format=1
Here is some R code:
library(RCurl)

url <- 'http://www.wunderground.com/weatherstation/WXDailyHistory.asp?ID=KWAKIRKL8&day=31&month=1&year=2014&graphspan=day&format=1'
s <- getURL(url)                    # download the raw response
s <- gsub("<br>\n", "", s)          # strip the <br> tags between rows
wdf <- read.csv(textConnection(s))  # parse into a data frame
And here is a page with which you can manually find stations and their codes.
http://www.wunderground.com/wundermap/
Since you only need a few you can pick them out manually.
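If you later need more than a handful of stations, the same URL pattern can be wrapped in a small helper. This is only a sketch; the station ID and date are just the example values from above:
library(RCurl)

get_station_day <- function(station, day, month, year) {
  url <- sprintf(
    "http://www.wunderground.com/weatherstation/WXDailyHistory.asp?ID=%s&day=%d&month=%d&year=%d&graphspan=day&format=1",
    station, day, month, year)
  s <- getURL(url)                  # download the raw response
  s <- gsub("<br>\n", "", s)        # strip the <br> tags between rows
  read.csv(textConnection(s))       # parse into a data frame
}

kirkland <- get_station_day("KWAKIRKL8", 31, 1, 2014)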
Currently, my line of code is really long and I was curious to know if there was a more efficient way of doing this.
As Nick has pointed out, your question is missing most of the information that would make it answerable. Please read more here, and add more information to your question.
In the meantime, a useful approach is to merge your data with a frame (or dataset) that contains the state-zipcode mapping.
* first you need to get the zipcode data from somewhere.
* Here is one way:
!wget "https://www2.census.gov/geo/docs/maps-data/data/rel/zcta_county_rel_10.txt"
* now put this data in a frame
frame create zctaFrame
frame zctaFrame {
    import delimited "zcta_county_rel_10.txt"
}
* now I'm making up a dataset (share some of yours with dataex, from SSC)
input str10 name zip
"sam" 55901
"sasha" 84101
"saul" 84111
end
frlink 1:1 zip, frame(zctaFrame zcta5)
frget state, from(zctaFrame)
If this doesn't match what you're trying to do, please add more detail to the question.
I am working on a Power BI report. There are two dimensions, DimWorkedClass and DimWorkedService (the snippet above was obtained by exporting the matrix values to CSV).
The requirement is that, for the Worked Service value Text5 only, the Worked Class should become Text5 instead of A (its current value).
It can be transformed at the backend, but is there any way to do it in Power BI?
This is trickier than it might appear, but it looks like this question has already been answered here:
Power Query Transform a Column based on Another Column
In your case, the M code would look something like this:
= Table.FromRecords(Table.TransformRows(#"[Source or Previous Step Here]",
      (here) => Record.TransformFields(here, {"Worked Class",
          each if here[Worked Service] = "text5" then "text5" else here[Worked Class]})))
(In the above, here represents the current row.)
Another answer points out a slightly cleaner way of doing this:
= Table.ReplaceValue(#"[Source or Previous Step Here]",
      each [Worked Class],
      each if [Worked Service] = "text5" then "text5" else [Worked Class],
      Replacer.ReplaceText, {"Worked Class"})
This is my first post on here, so please excuse any mistakes.
I have a column of cells. Each cell contains a variable number of lines within the cell. Most lines contain a date, and the format of the date varies slightly: sometimes it is MM/DD/YYYY, sometimes MM/DD/YY, etc. My goal is to extract the date associated with a specific word in each line. Also, each cell is on a row with an identifying number, so I need the output to be along the same row.
Example:
I have tried every extract date formula I can find and I have run into three problems:
how to pull multiple dates from the cell,
how to compensate for the fact that some rows have dates that are formatted differently, and
how to pull dates only associated with certain words on the same line as the date.
It appears that my best option would be to use regular expressions. However, I have only just started playing around with VBA, and I have been unable to adapt any of the related functions I have found to my specific problem. I was using this post as a guide to build my function initially, but I cannot get it to work: Extracting Multiple Dates from a single cell
Originally, I tried breaking the lines up with text to columns and this formula:
=IF(SEARCH("Red",D2),DATE(MID(D2,SEARCH("??/??/20??",D2)+6,4),MID(D2,SEARCH("??/??/20??",D2),2),MID(D2,SEARCH("??/??/20??",D2)+3,2)), "No Red Date")
However, text to columns was not working because of irregular spacing issues. And Blue 1 and Blue 2 are just there to handle the case where there are multiple Blue dates in the cell, which there often are.
Not really an answer: it doesn't need code. Playing around quickly with MID, FIND and SUBSTITUTE, I used the following:
=IF(FIND(C$1,$B2,1)-11<11,MID($B2,1,10),MID($B2,FIND(C$1,$B2,1)-11,10))
which pulls the 10-character date immediately preceding each keyword (or the date at the start of the cell when the keyword appears near the beginning).
I've noticed that Twitter's people search can come up with some weird results. Matching the query against screen_name, twitter_name and bio is obvious, but they also do something beyond that. I guess it has something to do with triadic closure, but I find its use for search (rather than suggestions) odd. I wanted to hear your thoughts on this.
I think your question might be a little nonspecific, but here are my thoughts:
Suppose your search query was "Miley Cyrus", for instance. Now the top results will for sure include her real account, then fake ones, but then the results will get a little distorted.
I expect it ranks each account / person X in this manner (or something similar):
If person X follows accounts that have the search query in their bio / name, X ranks higher than if they didn't.
In our search, "Rock Mafia" is a good example; it doesn't have the term "Miley Cyrus" in its bio or its name, but if you look at the people "Rock Mafia" is following, you'll find a lot of "similar" names / bios. Another ranking criterion would be this:
If person X has tweets that contain the search query, X also ranks higher.
A good example is the result "AnythingDisney" (@adljupdated): you can see that the 4th most recent tweet contains "Miley".
So basically the search prioritization looks like this:
Look in name / bio.
Need more results? Rank each person X by their followers and the people they follow, and by tweets that contain the query.
Need even more results? Look at "deeper" levels: rank each person X by the people being followed by the people X is following.
And so on, recursively. (A toy sketch of this kind of scoring follows below.)
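Purely as an illustration of that kind of scoring (a made-up toy, not Twitter's actual algorithm; every column name and weight below is invented):
accounts <- data.frame(
  name        = c("MileyCyrus", "RockMafia", "adljupdated"),
  bio_match   = c(TRUE, FALSE, FALSE),   # query appears in the name / bio
  follow_hits = c(0, 12, 3),             # followed accounts whose name / bio match the query
  tweet_hits  = c(5, 1, 2)               # recent tweets containing the query
)

# weight direct matches highest, then the follow graph, then tweet content
accounts$score <- 10 * accounts$bio_match + 2 * accounts$follow_hits + accounts$tweet_hits
accounts[order(-accounts$score), ]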
I hope this helped in some way!
Where can I find some GPS unit test data to validate my code?
For example:
Distance between two coordinates (miles / kilometers)
Heading/bearing from Point A to Point B
Speed from Point A to Point B given a duration
Right now I'm using Google Earth to fumble around with this, but it would be nice to know I'm validating my calculations against something, well, valid.
"GPS unit test data" is quite vague. You could easily have a pile of data, but if you don't know what they represent, what value are the tests?
If you're looking for a math sample of latitude/longitude calculations, check out the worked example in Wikipedia's great-circle distance article (http://en.wikipedia.org/wiki/Great-circle_distance#Worked_example). It takes two points and works through the math to compute the distance between them.
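If you just want a number to compare your own distance code against, a haversine sketch is easy to check by hand. The two coordinates below (roughly Nashville and LAX) are common example values, not an authoritative reference:
# great-circle distance via the haversine formula, in kilometres
haversine_km <- function(lat1, lon1, lat2, lon2, radius_km = 6371) {
  to_rad <- pi / 180
  dlat <- (lat2 - lat1) * to_rad
  dlon <- (lon2 - lon1) * to_rad
  a <- sin(dlat / 2)^2 + cos(lat1 * to_rad) * cos(lat2 * to_rad) * sin(dlon / 2)^2
  2 * radius_km * asin(sqrt(a))
}

haversine_km(36.12, -86.67, 33.94, -118.40)   # expect roughly 2886 km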
Or are you looking for the data that comes directly from a GPS unit? Those are called NMEA sentences. A GPS NMEA sentence begins with $GP, the next 3 characters are the sentence code, and the rest is the sentence data. http://aprs.gids.nl/nmea/ has a list.
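As a rough illustration of pulling one apart, here is a sketch in R. The GGA sentence below is a textbook-style example (checksum omitted), not real receiver output:
sentence <- "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,"

fields <- strsplit(sub("\\*.*$", "", sentence), ",")[[1]]   # drop any checksum, split on commas
code   <- sub("^\\$", "", fields[1])                        # "GPGGA": talker "GP" + sentence code "GGA"

# latitude/longitude arrive as ddmm.mmmm / dddmm.mmmm; convert to decimal degrees
to_decimal <- function(x, hemisphere) {
  deg     <- floor(as.numeric(x) / 100)
  minutes <- as.numeric(x) - deg * 100
  sgn     <- ifelse(hemisphere %in% c("S", "W"), -1, 1)
  sgn * (deg + minutes / 60)
}

lat <- to_decimal(fields[3], fields[4])   # about 48.117
lon <- to_decimal(fields[5], fields[6])   # about 11.517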
You could certainly Google for "sample nmea data". The magnalox site appears to have some downloadable sample files, but I didn't check them to see if they'd be useful to you.
A better option would probably be to record your own data. Connect your laptop to your GPS unit, capture the serial data the GPS emits, set the GPS to record your track, and take it for a short test drive. You can then compare what you compute from the captured data against the stored track (and against what you know from your little drive). You could even have a web cam record the screen of the GPS to capture heading/bearing information that doesn't arrive in the sentences.
Use caution if screen scraping NMEA sentences from a web site: all valid GPS sentences begin with "$GP", which gives you a quick sanity check on what you scrape.
RandomProfile offers randomly generated valid NMEA sentences.