I launched a scan of the site using default crawl & audit policies on Friday. Traffic is traversing a VPN over a 50/20 Mbps link, but I can get max throughput normally over the VPN, so no issues there. Burp is running on a VM on my laptop with 4 CPU cores and 8 GB of RAM. I am testing a WordPress-based site, which runs on a dedicated VPS.

The crawler got up to about 900 requests and 100-ish unique locations with a reasonable amount of time remaining on the clock; I think it got down to 2 minutes to go at one point. Then suddenly it jumped up to "5 hours remaining" and stayed there for a long time, and the speed of the crawl slowed down dramatically, from about 2 requests per second to 1 every 5 seconds.

I got on with some other stuff and let it do its thing in the background, checking on it occasionally. Sometimes it would seem like the scanner was hung: the number of requests wouldn't go up for minutes at a time, and it would make no requests for long stretches. Pausing and restarting seemed to bring it back to life, albeit still very slowly. I also noticed that the time remaining would jump around, sometimes going down to an hour or so, then climbing back up to 5 hours.

I did have Logger++ running at the start, but I ended up turning it off because I saw a post suggesting it slowed things down. While it was running, I could see that the scanner seemed to be making many requests to the same things over and over: every page that had external jQuery or Google Analytics on it was causing the same links to get hit thousands of times.

I am hoping some more experienced users can give me some insight here. I am pretty new to Burp Pro, so I don't have a lot of experience to know whether this is expected behavior or not.