
We currently monitor our webapps using cURL. More and more of our webapps use the GWT framework, which relies heavily on JavaScript, so our cURL-based monitoring is no longer reliable. We are therefore looking for the right monitoring tool, but it seems difficult to find a crawler that is lightweight (no Selenium, please) yet handles JavaScript correctly.

P.S.: we host our webapps as well as the probes; we don't want any Internet monitoring service.

  • Did you take a look at this thread stackoverflow.com/questions/2670082/… ? Commented Aug 27, 2012 at 16:10
  • Studying watir.com with headless mode. Commented Aug 28, 2012 at 14:46
  • New idea: defining a REST API on each webapp (/probe/step1, /probe/step2, ...). Commented Oct 10, 2012 at 15:28
  • I think the right answer is CasperJS/PhantomJS. It works really well on GWT sites, and it's a high-level descriptive language with no reference to GWT RPC. Commented Nov 8, 2012 at 9:18

2 Answers


I would highly recommend investing time into PhantomJS or CasperJS.

They are headless browser simulations that interact with pages at both the HTTP level and the DOM-level JavaScript layer. We use this at Top Hat to test our app, which is a very heavy JS client.

PhantomJS isn't for the faint of heart: getting it to work exactly the way you want can be challenging. However, it's worth the time.

Good luck, let me know how it goes @kentf.
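To make this concrete, here is a minimal sketch of what a CasperJS probe could look like. The URL and the `#app` selector are placeholders for your own GWT app (they are my assumptions, not from the answer), and the casper instance is injected into a function so the check logic stays in one place:

```javascript
// Hypothetical CasperJS probe sketch; adapt URL and selector to your app.
function probe(casper, url, readySelector) {
    casper.start(url, function () {
        // Queue a step that waits for the element the GWT module renders
        // once its JavaScript has bootstrapped.
        this.waitForSelector(readySelector);
    });
    casper.run(function () {
        // Exit code 0 = healthy, 1 = probe failed; easy to wire into
        // cron, Nagios, or whatever drives your in-house probes.
        this.exit(this.exists(readySelector) ? 0 : 1);
    });
}

// With CasperJS installed, you would save this as probe.js and run
// `casperjs probe.js` after calling, e.g.:
// probe(require('casper').create({ waitTimeout: 10000 }),
//       'http://intranet.example.com/myapp/', '#app');
```

Because the probe reports health via its exit code, the surrounding monitoring system only needs to run the script and check the return status, just as it did with cURL.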


Create a PHP page that downloads whatever is passed in $_GET["URL"] and saves it to temp.html. Then create an <iframe> with temp.html as its source, and use JavaScript to read the iframe's innerHTML. Pass that innerHTML to another PHP page that saves it as domain.html, and have your monitor inspect domain.html to see the JavaScript-rendered version of the site. You can add a line that opens the browser to start this whole process, then check domain.html 30 seconds later.
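The browser-side step of this approach could be sketched as below. The file names (temp.html, save.php, domain.html) come from the answer; the helper function names and the iframe id are hypothetical:

```javascript
// Reads the JavaScript-rendered DOM out of the iframe's document.
function captureRendered(frameDoc) {
    // By the time this runs, everything the page's scripts (e.g. GWT)
    // injected into the DOM is visible here.
    return frameDoc.documentElement.innerHTML;
}

// Ships the captured HTML back to the server; save.php is assumed to
// write its request body to domain.html for the monitor to inspect.
function shipToServer(html, xhr) {
    xhr.open('POST', 'save.php');
    xhr.send(html);
}

// In the page hosting <iframe id="probe-frame" src="temp.html">, the
// 30-second grace period from the answer would look like:
// setTimeout(function () {
//     var frame = document.getElementById('probe-frame');
//     shipToServer(captureRendered(frame.contentDocument),
//                  new XMLHttpRequest());
// }, 30000);
```

Note that this only works if temp.html is served from the same origin as the hosting page; otherwise the browser blocks access to the iframe's document.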
