How To Send Javascript And Cookies Enabled In Scrapy?
Solution 1:
You should try the Splash JavaScript rendering engine with scrapy-splash (formerly scrapyjs). Splash runs as a separate HTTP service that Scrapy routes requests through, which is why the settings point at a host and port. Here is an example of how to set it up in your project settings:
SPLASH_URL = 'http://192.168.59.103:8050'
DOWNLOADER_MIDDLEWARES = {
    'scrapy_splash.SplashMiddleware': 725,
}
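Since the question is also about cookie handling, note that the scrapy-splash README recommends a few more settings, including a dedicated cookies middleware. A sketch of the fuller configuration follows; the class paths and middleware orders are taken from that README and are worth verifying against your installed version:

# settings.py — fuller scrapy-splash setup per its README
SPLASH_URL = 'http://192.168.59.103:8050'

DOWNLOADER_MIDDLEWARES = {
    # Passes cookies between Scrapy and the browser running inside Splash.
    'scrapy_splash.SplashCookiesMiddleware': 723,
    'scrapy_splash.SplashMiddleware': 725,
    'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware': 810,
}

SPIDER_MIDDLEWARES = {
    'scrapy_splash.SplashDeduplicateArgsMiddleware': 100,
}

# Makes request fingerprints Splash-aware so duplicate filtering works correctly.
DUPEFILTER_CLASS = 'scrapy_splash.SplashAwareDupeFilter'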
Scrapinghub, the company behind Scrapy, also offers hosted instances for running your spiders with Splash enabled.
Then yield SplashRequest instead of Request in your spider, like this:
import scrapy
from scrapy_splash import SplashRequest

class MySpider(scrapy.Spider):
    name = "myspider"  # name added so the spider can be run with `scrapy crawl`
    start_urls = ["http://example.com", "http://example.com/foo"]

    def start_requests(self):
        for url in self.start_urls:
            yield SplashRequest(url, self.parse,
                endpoint='render.html',
                args={'wait': 0.5},  # let the page's JS run for 0.5s before rendering
            )

    def parse(self, response):
        # response.body is the result of the render.html call; it
        # contains HTML processed by a browser.
        # …
        pass
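For illustration, a minimal parse body might look like the sketch below; the selector and item field are invented for the example, not taken from the answer:

def parse(self, response):
    # The body was rendered by Splash, so JavaScript-generated
    # elements are present and visible to normal Scrapy selectors.
    for title in response.css("h1::text").getall():
        yield {"title": title}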
Solution 2:
AFAIK, there is no universal solution. You have to debug the site to see how it determines that JavaScript is not supported or enabled by your client.
I don't think the server looks at an X-JAVASCRIPT-ENABLED header. Maybe there is a cookie set by JavaScript when the page loads in a real JavaScript-enabled browser? Maybe the server looks at the User-Agent header?
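If debugging shows the check is a JS-set cookie or the User-Agent, both can be faked from plain Scrapy. A minimal sketch; the cookie name/value and User-Agent string here are placeholders, not values from any real site:

import scrapy

class MimicBrowserSpider(scrapy.Spider):
    name = "mimic_browser"

    def start_requests(self):
        # Manually supply a cookie the site would normally set via JS,
        # plus a browser-like User-Agent header.
        yield scrapy.Request(
            "http://example.com",
            cookies={"js_enabled": "1"},            # placeholder cookie
            headers={"User-Agent": "Mozilla/5.0"},  # placeholder UA string
            callback=self.parse,
        )

    def parse(self, response):
        self.log("got %d bytes" % len(response.body))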
See also this response.
Solution 3:
Scrapy doesn't execute JavaScript itself, but you can combine it with another tool that does, such as WebKit or Selenium, as in the sketch below. You also don't need to enable cookies (COOKIES_ENABLED = True) or add anything to DOWNLOADER_MIDDLEWARES in your settings.py for cookies to work, because cookie handling is already enabled in Scrapy's default settings.
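As a concrete illustration of the Selenium route, one common pattern is to fetch the page in a real browser so its JavaScript runs, then hand the rendered HTML back to Scrapy's selectors. A sketch, assuming a working Firefox WebDriver; the spider name and selector are invented:

import scrapy
from scrapy.http import HtmlResponse
from selenium import webdriver

class SeleniumSpider(scrapy.Spider):
    name = "selenium_example"
    start_urls = ["http://example.com"]

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.driver = webdriver.Firefox()  # any installed WebDriver works

    def parse(self, response):
        # Re-fetch the URL in a real browser so its JavaScript runs,
        # then wrap the rendered source so Scrapy selectors can use it.
        self.driver.get(response.url)
        rendered = HtmlResponse(
            url=response.url,
            body=self.driver.page_source,
            encoding="utf-8",
        )
        for href in rendered.css("a::attr(href)").getall():
            yield {"link": href}

    def closed(self, reason):
        # Called when the spider finishes; shut the browser down.
        self.driver.quit()

Note that in this pattern the browser keeps its own cookies, separately from Scrapy's cookie jar.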