I'm using the Anemone gem in the following way (a simplified sketch of the loop follows this list):
- Visit the first URL (the seed), save the page content to the database, and save all links from this page to the database as well (only links that are not in the database yet)
- Load the next link from the database, save its content and any new links, and repeat
- When there are no unvisited links left, crawl all links again (after some time period) to overwrite the old content with the new
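Roughly, the loop looks like this (`store_page`, `store_link`, and `next_unvisited_url` are just stand-ins for my database layer, and `:depth_limit => 0` keeps Anemone from following links itself, since I manage the queue in the database):

```ruby
require 'anemone'

# Fetch a single page and record its content and outgoing links.
def crawl_one(url)
  Anemone.crawl(url, :depth_limit => 0) do |anemone|
    anemone.on_every_page do |page|
      store_page(page.url.to_s, page.body)              # save/overwrite content
      page.links.each { |link| store_link(link.to_s) }  # save links not seen yet
    end
  end
end

# Main loop: next_unvisited_url returns nil once every stored link is crawled.
while (url = next_unvisited_url)
  crawl_one(url)
end
```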
This works pretty well, but is there any way to crawl pages that require a login (given that I know the username and password)? I know the Mechanize gem, which provides functionality to fill in forms, but I don't know how to integrate it into my process (if that is even possible); something like the sketch below is what I have in mind. Or is there some other way to crawl pages "behind" a login form?
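For example, would something like this work? The login URL and form field names are placeholders for my real form, and I'm assuming from Anemone's docs that it accepts a `:cookies` hash of name => value pairs:

```ruby
require 'mechanize'
require 'anemone'

# Step 1: log in with Mechanize (URL and field names are placeholders).
agent = Mechanize.new
login_page = agent.get('http://example.com/login')
form = login_page.forms.first
form['username'] = 'my_user'      # assumed field name
form['password'] = 'my_password'  # assumed field name
agent.submit(form)

# Step 2: copy the session cookies Mechanize received into a plain hash.
cookies = {}
agent.cookies.each { |cookie| cookies[cookie.name] = cookie.value }

# Step 3: hand the cookies to Anemone so every request carries the session.
Anemone.crawl('http://example.com/', :cookies => cookies) do |anemone|
  anemone.on_every_page do |page|
    puts page.url
  end
end
```

I'm not sure whether the session cookie would stay valid for the whole crawl, so maybe there is a better approach entirely (e.g. doing the whole crawl with Mechanize instead).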