[Solved] Scraping data off a site using 4 URLs for one day in R
You can turn all of the tables into one wide data frame with list operations:

```r
library(rvest)
library(magrittr)
library(dplyr)

date <- 20130701
rng  <- 1:4

# Grab the first <table> from each of the four hourly pages for this date
my_tabs <- lapply(rng, function(i) {
  url <- sprintf("http://apims.doe.gov.my/apims/hourly%d.php?date=%s", i, date)
  pg  <- read_html(url)  # read_html() supersedes the older rvest html()
  pg %>%
    html_nodes("table") %>%
    extract2(1) %>%
    html_table(header = TRUE)
})

# Join the four tables on their shared first two columns
glimpse(plyr::join_all(my_tabs, by = colnames(my_tabs[[1]][1:2])))

## Observations: 52
## Variables:
## $ NEGERI / …
```
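If you later need more than one day, the same pattern nests naturally: wrap the fetch-and-join logic in a function of the date and stack the results. Below is a minimal sketch, assuming the URL keeps the same YYYYMMDD `date` parameter and the table layout is identical across dates; `scrape_day` is a hypothetical helper name, not something from the original answer.

```r
library(rvest)
library(magrittr)
library(dplyr)

# Hypothetical helper: fetch and join the four hourly tables for one date.
# Assumes the same URL pattern and table structure as above.
scrape_day <- function(date) {
  tabs <- lapply(1:4, function(i) {
    url <- sprintf("http://apims.doe.gov.my/apims/hourly%d.php?date=%s", i, date)
    read_html(url) %>%
      html_nodes("table") %>%
      extract2(1) %>%
      html_table(header = TRUE)
  })
  joined <- plyr::join_all(tabs, by = colnames(tabs[[1]][1:2]))
  joined$date <- date  # keep track of which day each row came from
  joined
}

# Stack several days into one long data frame
dates <- c(20130701, 20130702, 20130703)
all_days <- bind_rows(lapply(dates, scrape_day))
```

Adding the `date` column before stacking is the key step: once the rows carry their own day, `bind_rows()` gives you a tidy long table you can filter or group by date downstream.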