Hacker News new | past | comments | ask | show | jobs | submit | megantic's comments login

After failing to fund an Ubuntu phone, now they're trying their luck again with not-so-cheap, not-so-good hardware (who wants an Intel Atom instead of ARM?).

I use Linux but I have come to really dislike Ubuntu over time, so this is pointless to me, but I wish them luck.


I cringe when I read their claim to be "the first-ever privacy-focused browser built on Chromium". Iron [1] was first released in 2008 to circumvent privacy issues with Chrome.

I wonder if they did a really poor job of researching what already exists and are truly clueless about Iron, or if they outright lied for marketing purposes. Hopefully they're not clueless about privacy and are not lying about features, though I would not bet my privacy on "hopefully", especially when the Epic browser website lacks an HTTPS version and the Epic browser bears a unique fingerprint on Panopticlick.

[1] https://www.srware.net/en/software_srware_iron.php
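To illustrate why a unique Panopticlick fingerprint matters: a fingerprint is typically built by hashing together stable attributes the browser exposes, so a rare combination identifies the user without any cookies. Here is a toy sketch of that idea (not Panopticlick's actual method; the attribute values are made up):

```python
import hashlib

def fingerprint(attrs: dict) -> str:
    # Canonicalize attributes into one string, then hash; any browser
    # exposing the same attribute set yields the same identifier.
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

fp = fingerprint({
    "user_agent": "EpicBrowser/1.0 (Windows NT 6.1)",  # hypothetical UA
    "screen": "1920x1080x24",
    "timezone": "UTC-8",
    "fonts": "Arial,Consolas,Webdings",
})
print(fp)  # a stable 16-hex-char identifier for this attribute set
```

The point is that a browser whose attribute combination is unusual (as a niche "privacy" browser's tends to be) produces a fingerprint shared by almost nobody else, making its users easier to track, not harder.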


SRWare is focused on removing Google stuff... not really privacy in general, at least from our view.


Ads, when you use the browser's search engine. See the FAQ on their website.


Looks like an attempt at pivoting, jumping on the privacy bandwagon. It doesn't really make a good case that this browser is based on a belief in privacy, as stated in their FAQ.


1) We'll take another example: say you have your own religious views, and those get collected in a database; seemingly no harm done here. Then comes a newly elected government with different religious views, which decides that people holding the same religious views as yours should wear a distinctive sign to warn the public, then gathers those people in camps, then starts mass killing those people on an industrial scale. This has happened before, and if it were to happen again in the future, you would have no way of hiding facts about yourself, as everything about you has been collected for years.


Wow... you just took web analytics and tracking and somehow morphed it into the Holocaust. That's probably the biggest slippery-slope argument I've ever seen in my life.


This is obviously an extreme example, but it's valid nonetheless. Nazi Germany had to build the database it needed (thanks, IBM); a future Nazi, fascist, or similar government wouldn't have to, because those databases already exist, in far more detail than Nazi Germany could have dreamed of (Facebook seems to know you're gay before you or your family do [1]).

But the Nazi example is one everybody can relate to, because we're all familiar with it. If it's too strong, we could go further back in history and talk about Richelieu ("If one would give me six lines written by the hand of the most honest man, I would find something in them to have him hanged.") and the current state of US law [2].

[1]: http://americablog.com/2013/03/facebook-might-know-youre-gay...
[2]: http://www.harveysilverglate.com/Books/ThreeFeloniesaDay.asp...


Are you saying that it's OK to globally disregard privacy, profile users, and sell their data/profiles without them knowing or getting a share of the profit, just because it doesn't make your life worse?


I don't think I said anything about global disregard. I asked why I should personally care. It seems like there are a lot of people trying to tell me I should care, but I did not once say that others should not care.


Well, the reasoning is quite simple, actually: if you care for freedom, then you should care for privacy. Without privacy, no freedom is possible.

Maybe read some of the many posts around about privacy, freedom, and rebuttals of the "I don't do anything wrong, so I have nothing to hide" argument; you could start with Schneier's blog: https://www.schneier.com/blog/archives/2006/05/the_value_of_...

But let me try to make a point about why you should personally care, even though you don't yet know or understand why. This is a case of closing the barn door after the horse is gone: if you later learn the hard way that you should have cared and taken the extra step of protecting your data, you couldn't go back and get your data back. And the sad reality is that if you have to learn this way, it means that history has indeed repeated itself and you're living under a tyranny.


Well, the reasoning is quite simple, actually: if you care for freedom, then you should care for privacy.

Sorry, but this sounds like the cr*p touted by politicians and overzealous "patriots", along the lines of "if you don't support the war then you are not a patriot and therefore must support the terrorists".

Despite what most on HN would like to believe, outside of the tech community most people don't care about their privacy being invaded, and the OP is entitled to his opinion of not caring, just as you (and I) are entitled to ours of caring, perhaps overly so. Isn't that the true definition of freedom, being able to make one's own choices?


I've saved that article for reading later and I plan to respond when I've finished. Thanks for the link.


Google tracking != anonymized search data


That's probably because you don't grasp what can be done with this data, or how long it will be around. The bottom line is the relationship between privacy and freedom.

There are a few points to raise here: how do you object to or prevent the sale of your data? How much of the money from the sale of your data went into your pocket? What control do you have over your sold data over time?

And it's not only about you and your life; ever heard of "First they came" [1]?

[1] https://en.wikipedia.org/wiki/First_they_came


I'll ignore your belittling comment and ask you to clarify what exactly you suppose will be done with the data about my browsing habits. I'll return the sentiment, however, and suggest that maybe you don't grasp that there are people who do understand and are not quite as scared as you. I generally don't live my life in fear, regardless of what dangers loom around the corner, known or unknown. This is the second time you've correlated internet browsing tracking to Nazi Germany and the Holocaust. It amazes me that you even use the internet with such enormously exaggerated fears.

I'll go further to suggest that you are unaware that knowledge, intellect, understanding, and the capability to grasp these concepts is in no way correlated with susceptibility to fear and worry.


For sure, but in the case of Google this probably doesn't apply.

From what was published recently, we know the NSA has proven methods for bypassing encryption, namely obtaining the keys used for encryption (so they can decrypt everything) or getting access to the content before encryption or after decryption.

To me, this latest move by Google is a PR attempt at regaining people's trust.


I'm so bored of hearing the accusations of PR stunts.

They crop up in every submission detailing an action taken by Google with regard to the Snowden/PRISM/NSA revelations. Is it so ridiculous that a large corporation should seek to ameliorate its image in the eyes of users and shareholders?

PR has become such a dirty word.

Of course it would be best if all these actions were taken earlier, purely as the result of a strongly held principle. However, when presented with the realities of public businesses operating on a global scale - I am glad that such steps as those detailed above are taken: at whatever stage, and for whatever reason.

The tinfoil hat brigade needs to, as the old saying goes, "stop seeing reds under the beds" and occasionally ... just occasionally ... take the facts presented to them.

In times when misinformation and confusion are so apt to proliferate, attempting to discern true motive is almost ridiculous, and condemnation on the basis of any such discernment doubly so.


When Google does something that makes it impossible for them to hand over certain types of data to the NSA, either by not collecting it, or making it so that only the user is able to decrypt it, wake me up. Until then, it's a PR stunt.
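For concreteness, "making it so that only the user is able to decrypt it" means client-side encryption: the client derives a key from the user's passphrase and encrypts before upload, so the server only ever stores ciphertext. Here is a toy sketch of that flow; the XOR keystream below is a stand-in for a real cipher (e.g. AES-GCM) and must not be used for actual security:

```python
import hashlib

def keystream(passphrase: str, salt: bytes):
    # Derive an endless byte stream from the passphrase (toy construction).
    counter = 0
    while True:
        block = hashlib.sha256(
            passphrase.encode() + salt + counter.to_bytes(8, "big")
        ).digest()
        yield from block
        counter += 1

def xor_crypt(data: bytes, passphrase: str, salt: bytes) -> bytes:
    # XOR is symmetric: the same call encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, keystream(passphrase, salt)))

salt = b"per-user-salt"
ciphertext = xor_crypt(b"draft email body", "hunter2", salt)
# The server stores only `ciphertext`; without the passphrase it cannot
# recover the plaintext, so it has nothing useful to hand over.
assert xor_crypt(ciphertext, "hunter2", salt) == b"draft email body"
```

The structural point stands regardless of cipher choice: as long as key derivation happens only on the client, the provider cannot comply with a request for plaintext.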


I am not disputing the fact that a major motivation for their actions is PR. I am suggesting that action as a result of PR pressure is still action - vastly preferable to meek acceptance of the status quo.

That being so - dismissing something as "just PR" misrepresents the actual benefits something like this may confer.


IMAP/POP3 access has always been a Gmail option, which allows local PGP use. Chrome sync allows you to set your own encryption passphrase (provided you trust the binary doing the encrypting...). You've been able to share encrypted files on Google Docs/Drive since they added arbitrary file storage. Etc.

Chrome sync is probably the strongest example that I can think of fitting your criteria, since it's built into the product itself, but a lot of this just comes with the territory of web-based apps.


They haven't done anything there, though... They've just provided a standard IMAP service and a standard file-syncing service...

When they provide an option in Gmail for people to upload their public PGP keys, start encrypting email on the way in, stop storing any non-encrypted versions of those emails, and build PGP support into Chromium for accessing them, then they will have done something worth noticing.


How would spam filtering or searching work in such a service?


Spam filtering:

  Step 1. Spam filter
  Step 2. Encrypt
Searching:

A client-side tool which builds a local index as messages are decrypted to be read for the first time. The index is itself encrypted and incrementally synced between clients.

That took me less than 5 seconds to think up. Google can spend time and money thinking up better solutions if they want to actually do something.
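The client-side index described above can be sketched in a few lines. This is my own toy illustration of the idea, not anyone's actual product: an inverted index grows incrementally as messages are decrypted, and its serialized form is what would be encrypted (e.g. with AES under a passphrase-derived key, omitted here) before syncing between clients:

```python
import json
from collections import defaultdict

class LocalMailIndex:
    def __init__(self):
        self.postings = defaultdict(set)  # term -> set of message ids
        self.indexed = set()              # message ids already indexed

    def add_message(self, msg_id: str, plaintext: str) -> None:
        """Index a message the first time it is decrypted and read."""
        if msg_id in self.indexed:
            return  # incremental: never re-index
        for term in plaintext.lower().split():
            self.postings[term].add(msg_id)
        self.indexed.add(msg_id)

    def search(self, term: str) -> list:
        return sorted(self.postings.get(term.lower(), set()))

    def serialize(self) -> str:
        # In a real client this blob would be encrypted before being
        # synced, so the server never sees the index contents either.
        return json.dumps({t: sorted(ids) for t, ids in self.postings.items()})

idx = LocalMailIndex()
idx.add_message("m1", "Quarterly invoice attached")
idx.add_message("m2", "Your invoice is overdue")
print(idx.search("invoice"))  # -> ['m1', 'm2']
```

The sibling comment's objections (index size, unread mail being unsearchable) are real trade-offs of this design, but the sketch shows the mechanism itself is not exotic.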


If there's any sort of processing on incoming data, then there's going to be a lot of unencrypted copies floating around in various caches and intermediate staging systems. A secure system requires encrypting the data right off the wire, before it's stored anywhere.

Search indexes are very large -- you don't want to double or triple the amount of storage your email client uses. Also, being able to search only mail that you've downloaded and decrypted is a terrible user experience. I'd estimate over 60% of the mail to my personal inbox is from some automated system, rather than directly from a human, and I typically don't look at them unless a search hits them.

It takes 5 seconds to think of solutions with terrible security and usability characteristics. Thinking of a system that will be a measurable improvement in security and will actually be used by people is much more difficult.


These are all easily solvable issues. But to get back to the point of this thread: Google has done nothing to help secure people's email.

The fact that you can't identify any ways in which they could, or refuse to acknowledge them, or think they're too difficult for a multi-billion-dollar company, makes no difference to the point under discussion.

