Social Engineering

Dark Reading asked a rhetorical question recently: “When Will End Users Stop Being Fooled By Online Scams?” Well, you probably guessed the answer right away: “never”. I do not think it is possible to train the whole population of the planet in the intricacies of security. So social engineering attacks, in all their variety, are here to stay.

From this point of view, the “training” you get early in life matters, I think, quite a lot. I would hazard a guess that people who tried various social engineering tactics on the people around them when they were kids are less gullible as a result. So we should not be so hard on our kids when we catch them lying and trying to trick others. Yes, they should know it is not acceptable. But they should also know how it is all done and come to expect this kind of trickery, so they can more easily recognize social engineering attempts directed at them. So do not punish them too hard; better to teach them how to do it in a harmless way.…

continue reading →

CakePHP: bind hasMany as hasOne

This is totally brilliant. I came across this marvel somewhere and adapted it to my application. See, if you have a hasMany relation, you end up with (1) an extra query and (2) a lot of data. I have a case where I just need the latest row (by creation time). So I basically want to bind that model in a hasOne relation where the single row to join is picked by an expression.…
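Here is a minimal sketch of the idea, assuming hypothetical Post/Comment models and CakePHP 1.x/2.x syntax; the original code I adapted may have looked different:

    <?php
    // Hypothetical example: Post hasMany Comment, but for one particular find
    // we only want the newest comment per post, joined in the same query.
    class Post extends AppModel {

        public function findWithLatestComment() {
            // Drop the hasMany association for this call only...
            $this->unbindModel(array('hasMany' => array('Comment')));

            // ...and rebind Comment as hasOne, restricted by a correlated
            // subquery to the single most recent row per post.
            $this->bindModel(array(
                'hasOne' => array(
                    'Comment' => array(
                        'foreignKey' => 'post_id',
                        'conditions' => array(
                            'Comment.id = (SELECT MAX(c2.id) FROM comments AS c2 WHERE c2.post_id = Post.id)'
                        ),
                    ),
                ),
            ));

            // hasOne is fetched with a JOIN, so there is no second query and
            // no pile of unwanted Comment rows in the result.
            return $this->find('all');
        }
    }

Because CakePHP pulls hasOne associations in with a JOIN on the main query, this avoids both the extra query and the mass of unneeded rows.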

continue reading →

Orwell’s rules in security

I came across the “six rules of the English language” that George Orwell set forth in his essay “Politics and the English Language”, quoted in one of the posts on Jordan Bortz’s Software Architecture Blog. They are:

  1. Never use a metaphor, simile, or other figure of speech which you are used to seeing in print.
  2. Never use a long word where a short one will do.
  3. If it is possible to cut a word out, always cut it out.
  4. Never use the passive where you can use the active.
  5. Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent.
  6. Break any of these rules sooner than say anything outright barbarous.

These rules are absolutely essential for good system or application security. All too often we have a situation where the real aim is to deliver an insecure system, and that fact is obscured by the use of this “political language”. To turn Orwell’s words to our subject, the great enemy of software security is insincerity. When there is a gap between one’s real and one’s declared aims, one does not get proper security.…

continue reading →

Double meaning not intended

Some people never read what they write. As a result, the following gem comes to us from CafeSoft:

“I can’t tell you how reliable your product has been.” – Cams SME Customer

Sure, he can’t tell how reliable the product was, because it wasn’t! :) People, do me a favour – read your pages, read your blurbs, read your own posts. Please.…

continue reading →

Scroogled

I just came across this wonderful piece: Scroogled by Cory Doctorow

“The courts won’t let them indiscriminately Google you. But after you’re in the system, it becomes a selective search. All legal. And once they start Googling you, they always find something. All your data is fed into a big hopper that checks for ‘suspicious patterns,’ using deviation from statistical norms to nail you.”

and I think it is a must-read for all of us. And that’s besides it being an entertaining read.…

continue reading →

Biometrics is not for authentication, folks!

The capacity of people to persist in their delusions never ceases to amaze me.

Yet another researcher is wondering why biometric authentication does not work: “Ten to twenty per cent of utterances collected by voice biometrics systems are not strong identifiers of the individual that spoke them…”, says Dr. Clive Summerfield.

There is a serious problem with biometrics, and maybe this problem is not voiced loudly enough, since we see the same thing again and again. The problem is: biometric characteristics cannot be changed. Everybody knows that, right? The logical consequence is that biometric data can be successfully used to identify a person but cannot be used to authenticate a person. Let me repeat that:

The biometric data can be used to identify but not to authenticate a person.

It works very well as a means of identifying someone, and that is how we have used it for many years quite successfully (what do you think the picture in your passport is for?). But in order to use it to authenticate a person, to be an authentication token, the person must be able to change it. Must be able to change the biometric data, period. There is no other way. And almost all research in biometrics revolves around this silly subject: how to change the immutable? After twenty years of this circus it should be obvious to everyone and their dog, but no-o-o…

Biometric data has been used successfully for identification for thousands of years precisely because it is difficult to change. And biometric data can never be used for authentication for the very same reason: because it is so hard to change. It is that simple, and still we have hundreds of people around the globe denying the obvious.

Here is a simple rule of thumb: if a “security specialist” talks about providing authentication based on biometric data – run for your life!…

continue reading →

FSF: Defend user freedom on tablets and smartphones

In December, Microsoft apparently conceded to public pressure by quietly updating the Windows 8 logo certification requirements with a mandate that a desktop computer user must be able to control (and disable) the Secure Boot feature on any Windows 8 computer that is not based on ARM technology. This looks like a victory for free software users, as it will allow a person to install GNU/Linux or other free software operating system in place of Windows 8.

But, this is no time for celebration, because Microsoft has also added a treacherous mandate for makers of ARM-based computers — such as tablets, netbooks, and smartphones — requiring them to build their machines with Restricted Boot technology. Such computers are designed to lock a user into only being able to run Windows 8, absolutely preventing her from being able to install a free software operating system on her computer. Since smartphones and tablets are some of the most commonly used computers, it’s vital that we get straightforward and clear information about this threat out to the public.

Already know what this is about? Then take action now:

  • Raise awareness and have fun while putting pressure on Microsoft and computer makers by entering the Restricted Boot Webcomic Contest.
    • Winning submissions will be featured on the front page of fsf.org for a month.
    • Entries must be submitted by March 17th by emailing campaigns@fsf.org.
  • Sign the statement “Stand up for your freedom to install free software.”
    • For individuals
    • For organizations and corporations

If this is the first you’re hearing about this whole Restricted Boot vs. Secure Boot business, read the full story.

You can support this campaign and the rest of the FSF’s work by joining as a member or making a donation today.

Sincerely,

Josh, John, Matt, and Richard
Free Software Foundation

P.S. This is a verbatim copy of the FSF newsletter. I see no need to say it differently.…

continue reading →

RSA: 99.8% Security

The folks over at the École Polytechnique Fédérale de Lausanne have published a very interesting paper titled “Ron was wrong, Whit is right”. It is not too mathematical for a cryptanalytic paper and is understandable even to someone without a crypto background. It is more of an investigation into the properties of the public keys publicly available on the internet. The authors explain how, by collecting a large number of keys from the internet in entirely proper and official ways and analyzing them, they were able to find collisions that basically allow one person to impersonate another, not to mention some outright weak keys that offer no security at all. Fascinating stuff.
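One of the paper’s findings is that some of those moduli share a prime factor, which is fatal: a single GCD computation then factors both keys. A toy sketch with made-up tiny numbers (nothing here is taken from the paper itself):

    <?php
    // Toy illustration only; real RSA moduli are hundreds of digits long and
    // would need a bignum library such as GMP (gmp_gcd) instead of plain ints.
    function gcd($a, $b) {
        while ($b != 0) {
            list($a, $b) = array($b, $a % $b);
        }
        return $a;
    }

    // Two "moduli" that accidentally share the prime 101:
    $n1 = 101 * 113; // 11413
    $n2 = 101 * 127; // 12827

    $p = gcd($n1, $n2);                      // 101, the shared prime
    echo "n1 = $p * " . ($n1 / $p) . "\n";   // n1 = 101 * 113
    echo "n2 = $p * " . ($n2 / $p) . "\n";   // n2 = 101 * 127

Once a modulus is factored, the corresponding private key falls out immediately, which is exactly why such keys offer no security at all.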

A cool comment all the way at the bottom says:

“The lack of sophistication of our methods and findings make it hard for us to believe that what we have presented is new, in particular to agencies and parties that are known for their curiosity in such matters. It may shed new light on NIST’s 1991 decision to adopt DSA as digital signature standard as opposed to RSA, back then a ‘public controversy’.”

Which is probably true, you know…

continue reading →