Probably every programmer has had a phase when he started developing his own encryption algorithm, usually early in his professional life, right after learning about XOR and the magic it does. Most programmers soon realize that they are not cryptographers and that their algorithm is shitty at best. Those who don't usually end up working on DRM later (and those things are never broken, are they?).
But it makes me wonder: are we approaching this all wrong? In a spy-happy world where the NSA seems to influence security standards and bulk decryption seems to be a reality, I would argue that rolling your own encryption has some benefits.
Since bulk collection relies on all data being in a similar format, anything you do to break that uniformity makes you effectively invisible. Let's assume that AES is broken (don't worry; it is not). Anyone relying on standard AES would be affected. But if some wise-ass simply XORed his data with 0xAA first, there is a high probability it would skip the collection.
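Just to show how trivial that wise-ass move is (Python, purely for illustration - this is obfuscation, not encryption):

    def xor_aa(data: bytes) -> bytes:
        # XOR every byte with 0xAA; applying the same function twice
        # restores the original, so this "decrypts" as well
        return bytes(b ^ 0xAA for b in data)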
Mind you, stupid encryption is still stupid. And if you are targeted by the NSA, there is a high probability that they will get your data regardless of what you do. If you are using some homegrown encryption, it will be broken. However, they will be unable to collect that data in an automated manner. Enough people doing this would mean dedicating human resources to every shitty algorithm out there. And you are probably not important enough to warrant such attention.
A smarter choice would probably be to use two encryption algorithms, back to back. You can encrypt data once with Rijndael, then encrypt the result with Twofish under another key (maybe derived via Tiger). I am quite comfortable saying that such encryption will not be broken by any automated means. The system might have huge gaping holes, but it will take a human to find them.
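A minimal sketch of such a cascade, using the third-party Python cryptography package. That package ships neither Twofish nor Tiger, so Camellia and SHA-256 stand in for them here; the idea is the same - two independent ciphers under two independent keys:

    import os
    import hashlib
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def cascade_encrypt(plaintext: bytes, key: bytes) -> bytes:
        # 'key' must be 16, 24, or 32 bytes long.
        # Derive the second key from the first via a hash
        # (SHA-256 here; the post suggests Tiger).
        key2 = hashlib.sha256(key).digest()
        iv1, iv2 = os.urandom(16), os.urandom(16)

        # First pass: AES (i.e. Rijndael) in CTR mode.
        c1 = Cipher(algorithms.AES(key), modes.CTR(iv1)).encryptor()
        middle = c1.update(plaintext) + c1.finalize()

        # Second pass: a different cipher under the derived key
        # (Camellia here, standing in for Twofish).
        c2 = Cipher(algorithms.Camellia(key2), modes.CTR(iv2)).encryptor()
        return iv1 + iv2 + c2.update(middle) + c2.finalize()

Even if one cipher falls, the attacker still faces the other one - and, more to the point, no off-the-shelf tooling expects this exact combination.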
Of course, once you start doing your own "twist" on an encryption method, you suddenly become completely incompatible with all the other "twists" out there. Implementations will also become slower (yep, encrypting stuff twice costs). And two encryption algorithms will not really protect you against a targeted attack where e.g. a trojan is used to steal your password and circumvent all that encryption. Nobody will bother to do cryptanalysis on your exact combination, so you are pretty much flying in the dark. And probably another bad thing or two I forgot.
However, there is something attractive about rolling your own encryption from standardized cipher building blocks for data you deem important (e.g. password storage). Not only is it an interesting defense, but it also gives you the enjoyment of doing something you know you shouldn't.
PS: Never take cryptography advice from a random guy on the Internet.
After seven years of www.medo64.com, I decided to follow fashion and drop the www. I was pleasantly surprised at how easy it was.
Since these pages are WordPress-based, the first step was simply changing the WordPress address and the site address. And since I want all www.medo64.com requests to be redirected, I also adjusted the .htaccess file. For any request with a host other than the current one, it does a simple redirect.
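Something along these lines does it (the usual mod_rewrite approach; R=301 makes the redirect permanent):

    RewriteEngine On
    # Match any host that is not exactly medo64.com (e.g. www.medo64.com)
    RewriteCond %{HTTP_HOST} !^medo64\.com$ [NC]
    # Send the visitor to the same path on the bare domain
    RewriteRule ^(.*)$ http://medo64.com/$1 [R=301,L]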
A few days ago Google announced that Google Reader is going the way of the dodo. Personally, I was quite annoyed by this because it is my RSS reader of choice, but there is enough time to search for an alternative.
There were quite a few reactions to this decision, but most of them stayed in the realm of realism. And then came Dvorak with his suggestion that Google release the source code into the public domain. Wishful thinking aside, this is never going to happen.
Google Reader is an old product and I can guess that its backend is very much reliant on Google's infrastructure. Releasing it into the wild would require quite a lot of work to make the internal dependencies go away. A company that is closing down a product will not spend its engineers' time preparing that product for a future competitor.
Nor should it. Nobody expected WordPerfect to become open source as it was dying, and that was an excellent program with a lot of users. Why would the expectation be different for any other program just because it is on the web? And don't give me that crap about being able to use the last version indefinitely. When a program dies, everybody switches to something else regardless of how good the last version was.
Even if it went open source, I don't think it would stay the Reader we know. First, it would lose its simplicity: everybody would add a feature or two to solve their particular problem. The lean Reader would soon start putting on some fat and go the way of Firefox - a nice browser that accumulated way too many configuration options over time.
Google Reader is dead. Face it.
PS: Yes, I think that Google is being a bit evil. But it is not their first time, nor will it be the last.
I am an honest man. Not a perfect one, but I try. Given the choice and an acceptable price, I will always try to buy.
I have already been pissed off a few times by my inability to buy content in Croatia. It wasn't a question of money; the content was just not available for my IP.
Since I am in the States now, I thought no further issues would arise - I have money and I have a proper address. Armed with confidence, I tried to rent a movie on Amazon. Guess what? I was denied the opportunity to actually pay.
Amazon has a policy of not allowing movie purchases with foreign credit cards. It does not matter that I am actually living in the US. It does not matter that I can use my credit card for other Amazon purchases. I don't have a US credit card and that is enough to deny me the purchase.
And then the movie industry wonders why people turn to pirates…
As the year is now over, it is a good time to pull out some statistics.
There were a total of 117 new posts last year. Programming was the biggest category with 35%, and program update posts came second with 15%. Everything else was a mishmash of various topics.
Traffic-wise I did well, since the number of visits almost doubled, as did the number of page views. Readers came mostly from the US (24%) and Germany (10%). Since more than 60% of visitors don't disclose their location, I can only assume they follow the same distribution.
A bit over 50% of visitors came looking for programs I have on the site, VHD Attach and MagiWOL being the most popular. Windows 8 proved to be quite a popular topic with something like 20% of visits. Unfortunately, this is not because of its popularity but mostly because people were looking at troubleshooting posts.
Uptime this year was 99.87% as calculated by Pingdom. How close this number is to reality, I cannot be bothered to check. It sounds plausible in any case. And who can prove me wrong anyhow?
Everybody thought it would be a comet, a volcano, or something equally noticeable, but the end came silently with a whiff of decomposing bodies. Yes, zombies are among us.
I hope everybody currently living will manage to survive. Good luck.
Starting December 6th, 2012, there is no more free Google Apps. If someone wants Google to handle mail for his domain, he can now expect a hefty yearly charge of $50 per user.
All current users of the free Google Apps should probably start searching for a new home, since I don't expect this status quo to last. My assumption is that existing users will start getting "pay up or get lost" mails pretty soon. For a family of four this means either $200 a year to continue using it or dealing with account migration. Let's hope that Google's "Don't be evil" motto will prevent this extortion-style scenario from happening.
This unfortunate turn of events just goes to show how living in the cloud leaves you at the mercy of your hosting company. There is no such thing as a free lunch.
If you are wondering what the ITU can offer if it takes over control of the Internet, wonder no more. A great example of its moronic and damaging decisions is already here - Y.2770, Requirements for deep packet inspection in Next Generation Networks.
This standardizes the way a government can spy on any traffic that goes over your favorite telco. The standard also keeps the door open to decrypting a user's traffic "in case of a local availability of the used encryption key(s)."
Whoever thinks this is a good idea and trusts his government is a complete moron.
My own government was caught some time ago (translation) just adding unrelated phone numbers to existing warrants. The end result was that this practice was deemed quite appropriate. The US government is no better with all its NSA activities. Let's not even go into areas where warrants are not required. And there are plenty of other governments with even lower standards.
This standardization will allow them to use one kind of equipment at every telco, thus reducing the cost of tracking a citizen. Forcing companies to hand over their SSL keys, so that users cannot hide their traffic through encryption, is the next step. Hell, I can bet that some countries are already doing that.
Quite a lot of Internet security relies on encrypted things staying encrypted. Once that encryption is taken away, everything you think of as private is not so private anymore. Each time you access your bank to make a transaction, someone else is peeking over your shoulder.
And "honest people have nothing to hide" is bullshit. How long do you think it will take until some security hole is discovered in those systems?
The same standardization that makes life easy for governments will also make it a sweet delight for criminals. Just filter through a minute of traffic and you will have enough information to get into thousands of accounts. I will not even cover what a single rogue agent can do, e.g. if he finds out that his wife is cheating on him: he can gather or falsify enough information to get the other guy in real trouble. And I am being optimistic; real-life scenarios will be much darker.
And this is the mess that the ITU is already capable of. Imagine the ideas that will spring to mind if it gains control over the Internet. Orwell was right; he just missed the year.
This December the International Telecommunication Union is asking for the keys to the Internet. Finally there will be a central body to control and help expand the network we all use.
And I am scared.
When I look at the trend of various countries clamping down on Internet privacy and usability in the name of "security", I can just imagine the ITU making it mandatory for all countries to implement a kill switch in order to prevent cyber attacks. Then comes some content filtering of naughty material "for the sake of the children". One tiny step removed, I can see countries suppressing criticism and any content they do not care for.
I do not trust my government (or any other, for that matter) to make a wise choice. A government will by design try to keep itself in power regardless of what it takes. And every government having the same input in the ITU is just a recipe for the Great Wall of [insert country name here].
Another huge concern is that the ITU is actually quite telco-heavy. That means all decisions would be heavily influenced by what powerful telecom operators think. And I can just see how they would like to limit (already low) speeds in order to avoid investing in their networks any further. Telcos do not care about people, nor should they; telcos are corporations and they only care about profit for their shareholders.
Yes, currently the USA holds all the keys to the Internet, and the situation is not ideal. However, even so, there is a lot of freedom left for engineers to do what they need. And the "freedom talk" they are all so fond of does keep most censorship and kill switches away for now. Having someone like China determine how open the Internet will be is not a thought I cherish.
Maybe I am just paranoid, but I cannot see how this can end well.
Having files in the cloud has become the norm. I myself have Dropbox, SugarSync, Google Drive, and SkyDrive accounts. Every day I rely on them to get to my files and synchronize them across my devices.
It gets even worse on tablets. Most of them, regardless of whether they run Android, iOS, or Windows, rely on cloud storage quite a bit. And why not? Cloud services have become so reliable from a technical point of view that data loss is highly improbable. A solution that requires no backup is finally here.
Unfortunately, all those technical accomplishments are irrelevant. TechDirt brings quite a few stories of cloud data loss - not because of technical glitches but because of false copyright claims.
Copyright holders have been quite active in auto-detecting infringing files with most (if not all) cloud providers. If they find an unauthorized work, they delete it and you get one strike. A few more flagged files later and you might find yourself with a blocked account. All your files are gone.
The annoying thing about copyright claims is their "shoot first, ask questions later" approach. It does not matter whether a file is really infringing; all that matters is that some automated bot thinks it is. Then you need to jump through hoops to prove your innocence. Even with the best effort, and even in cases of obvious error, that can take days - all while your data is held hostage.
Cloud services are an integral part of life these days and almost every computer has some data up in the sky. But do not think your data is safe just because someone else takes care of it.