8

I'm working on a fairly large web site built in PHP that will potentially have a lot of users. I'm looking into a way to protect the login screen from automated attempts. I have already included a CAPTCHA check on the registration form, but want to harden the site further.

There have been similar questions on Stack Overflow that I know of, and I know I'm capable of implementing this myself from scratch (storing login attempts and their timestamps in the db), yet I dislike that path:

  • Conceptually, I think this kind of logic belongs at the web server/infrastructure level, not the application level. I dislike having this logic and complexity in my application.
  • I worry about performance, particularly at the database level.
  • I'm lazy, in a good way, by not wanting to build a common utility like this from scratch

Any advice is appreciated. I think I'm particularly looking for some kind of Apache module that can do this. My platform is PHP 5 (using CodeIgniter), Apache 2 and MySQL 5.

Fer
  • 4,116
  • 16
  • 59
  • 102

3 Answers

18

update: do not use sleep() for rate limiting! as the comments below show, the delay applies per request, so an attacker can simply fire many concurrent requests. i don't have a better solution on hand.


a good start would be to just sleep(1); after a failed login attempt - easy to implement, almost bug-free.

1 second isn't much for a human (especially because login attempts by humans don't fail too often), but at 1 sec/try, brute force gets ... sloooow! dictionary attacks may be another problem, but they're in the same domain.

if the attacker starts too many connections to circumvent this, you're dealing with a kind of DoS attack. problem solved (but now you've got another problem).

some stuff you should consider:

  • if you lock accounts solely on a per-IP basis, there may be problems with private networks (many users behind one NAT).
  • if you lock accounts solely on a username basis, denial-of-service attacks against known usernames would be possible.
  • locking on an IP/username basis (where username is the one attacked) could work better.

my suggestion: complete locking is not desirable (DoS), so a better alternative would be: count the login attempts for a certain username from a unique IP. you could do this with a simple table failed_logins: IP/username/failed_attempts
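for illustration, the failed_logins table could look like this in MySQL (a sketch only; column names and sizes are assumptions, not part of the answer above):

```sql
-- one row per IP/username combination; the primary key lets
-- INSERT ... ON DUPLICATE KEY UPDATE do the counting atomically
CREATE TABLE failed_logins (
    ip_usr          VARCHAR(255) NOT NULL,
    failed_attempts INT UNSIGNED NOT NULL DEFAULT 1,
    PRIMARY KEY (ip_usr)
);
```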

if the login fails, sleep for failed_attempts seconds before responding. every xx minutes, run a cron script that decreases failed_logins.failed_attempts by one.

sorry, i can't provide a premade solution, but this should be trivial to implement.

okay, okay. here's the pseudocode:

<?php
$login_success = tryToLogIn($username, $password);

if (!$login_success) {
    // some kind of unique key per IP/username combination
    $ipusr = getUserIP() . '|' . $username;

    // insert a new counter row, or bump the existing one
    DB::update(
        'INSERT INTO failed_logins (ip_usr, failed_attempts)
         VALUES (:ipusr, 1)
         ON DUPLICATE KEY UPDATE failed_attempts = failed_attempts + 1',
        array(':ipusr' => $ipusr)
    );

    $failed_attempts = DB::selectCell(
        'SELECT failed_attempts FROM failed_logins WHERE ip_usr = :ipusr',
        array(':ipusr' => $ipusr)
    );

    // the penalty grows with every failed attempt
    sleep($failed_attempts);
    redirect('/login', array('errorMessage' => 'login-fail! ur doin it rong!'));
}
?>
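the decay cron script mentioned above could boil down to two queries against the same table (again just a sketch, assuming the failed_logins table described earlier):

```sql
-- let the counters cool down over time instead of resetting them
UPDATE failed_logins
   SET failed_attempts = failed_attempts - 1
 WHERE failed_attempts > 0;

-- optionally drop rows that reached zero, to keep the table small
DELETE FROM failed_logins WHERE failed_attempts = 0;
```

schedule it from crontab at whatever interval ("every xx minutes") fits your tolerance for stale penalties.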

disclaimer: this may not work in certain regions. last thing i heard was that in asia there's a whole country NATed (also, they all know kung-fu).

stefs
  • 18,341
  • 6
  • 40
  • 47
  • nice and simple solution I agree, but it still requires database interaction. What if I simply add the 1 sec delay between the failed login detection and the re-rendering of the login screen? This would force both bots and humans to have to wait for the delay. For a human it is not much, and few humans will do a wrong login, whilst for a bot it is quite long. What do you think? – Fer May 20 '09 at 13:17
  • that doesn't really help, because i can do N concurrent requests. the 1 second block happens on a per-request basis, so the N concurrent requests finish after a bit more than 1 second (given the server can handle the load). so in principle, i could launch 10.000 requests and have them finishing after some seconds, because they aren't serial. – stefs Jun 04 '09 at 14:00
  • also, there are > 80.000 seconds a day. 80.000 isn't much for brute force, but it's probably enough for a dictionary attack. – stefs Jun 04 '09 at 14:05
  • good answer but I completely disagree with "because login attempts by humans don't fail too often"... they fail ALL the time, more than you imagine – Juan Ignacio Feb 23 '12 at 14:10
  • @juan: not often ... compared to bots! if i forget my password or if i'm too drunk to hit the right keys i'll try my password maybe up to 10 times before giving up. 20 times max, before requesting a new one. a brute force or dictionary attack would generate hundreds of failed login attempts for a single user in a very short amount of time - seconds or minutes - because with a human-reasonable fail quote the chance of succeeding in reasonable time would be way too low. `cat /usr/share/dict/words | wc -l` gives me 234936 (english words). – stefs Feb 23 '12 at 19:31
  • Please avoid sleep. It makes it easier for an attacker to harm your service: if you sleep for 5 seconds, the script sleeps regardless of which request initiated the sleep, even for other users. I already tested that. – Firas Abd Alrahman Jun 19 '17 at 22:26
2

A very rough, untested example, but I think you will get the main idea here.

// assume $loginAttempts and $unlockTime were read from the users table beforehand
if ($unlockTime && (time() > $unlockTime))
{
    // the lock has expired - reset the counters
    query("UPDATE users SET login_attempts = 0, unlocktime = 0 ... ");
    $loginAttempts = 0;
    $unlockTime = 0;
}
elseif ($unlockTime)
{
    die('Your account is temporarily locked. Reason: too many failed login attempts.');
}

if (!$logged_in)
{
    $loginAttempts++;
    $unlockTime = 0;
    if ($loginAttempts > MAX_LOGIN_ATTEMPTS)
    {
        $unlockTime = time() + LOCK_TIMEOUT;
    }
    query("UPDATE users SET login_attempts = $loginAttempts, unlocktime = $unlockTime ... ");
}

Sorry for the mistakes - I wrote it in a few seconds and didn't test... You can do the same by IP, by nickname, by session_id, etc.

Jet
  • 1,171
  • 6
  • 8
-3

Why don't you wait with "hardening" and "scaling" your app until you actually have that problem? The most likely scenario is that the app will never have "a lot of users". This sounds like premature optimization to me, something to avoid.

  • Once you get bots abusing it, harden the signup. I'd actually remove the captcha until you start getting > 1000 signups/day.
  • Once you get performance problems, improve performance by fixing real bottlenecks.
PeterV
  • 2,792
  • 3
  • 20
  • 22
  • Whilst you may be right conceptually, you do not know the background of my project and cannot conclude that it will likely never get these problems. I consider these protections best practice no matter the size of the website. If I do not protect against login attempts, passwords can be stolen. How is it premature to protect against that? The same goes for CAPTCHAs. I am using my own custom blog software, very much a niche. I only have 300 readers and still bots attack it with comment spam. A CAPTCHA is a must-have in my experience. – Fer May 20 '09 at 13:21
  • Why childproof the cutlery drawer? Why not just wait until the child starts playing with knives? – Twifty Jul 24 '14 at 05:11
  • This is horrible for obvious reasons. – S. Saad Nov 14 '17 at 18:39