Uh oh. For a long time I've been giving myself the excuse that the only reason I keep using Gmail is security: Google has never had this kind of breach.
That argument is no longer valid; time to move off Gmail.
If I understood the follow-up blog post right, it says there are a lot of email addresses where the only hit is the email domain, so they are filtering those out as false positives. Not all stolen credentials are properly aligned and encoded with the ; in the correct place, I guess :-)
There might be far fewer Gmail addresses showing up as pwned now.
Great, I have a question for you if you can be bothered.
I recently "bought" a .com domain. It's not a popular English word, it's a small number of letters that match some truncation of my names. No competition, nobody wants it. Anyway, I looked at a handful of registrars and got the cheapest. But then I was a bit concerned. Why are they the cheapest, are they cutting corners somehow? The difference was substancial. GoDaddy was one of the most expensive ones are $23/yr and I bought it from dynadot for something like $6/yr. Why such a huge difference?
It's almost impossible to judge based on the pricing alone. Paying top dollar doesn't guarantee anything extra and low prices don't always mean low quality service.
$6.99 looks like a first year discount at dynadot with renewals at $10.88 USD.
> Why such a huge difference?
A lot of registrants don't know what wholesale pricing looks like, and for a business the difference between $10 and $25 / year isn't a factor. The risk that comes with transferring to a cheaper registrar isn't worth it to save $15 / year.
“Then there is the % address operator: user%domainB@domainA is first sent to domainA, which expands the rightmost (in this case, the only) percent sign to an @ sign. The address is now user@domainB, and the mailer happily forwards your message to domainB, which delivers it to user. This type of address is sometimes referred to as “Ye Olde ARPAnet Kludge,” and its use is discouraged.”
I would guess it's an anti-spam measure. Although if I'm reading the sibling comment right, it is actually a valid email address? (Assuming you have a mail server running on localhost.)
The respective mail server likely checks which domains it will forward mail to, e.g. only allowing netbsd.org, or only allowing mail from localhost. In the more distant past that wasn't the case, so spammers would send their mail to the domain of such a mail server, which would blindly forward it to whatever domain was encoded after the percent sign in the local part. It would effectively serve as an open mail relay.
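A minimal shell sketch of the rewriting step the quote describes (the user and domain names are made up; a real relay splits on the rightmost percent sign, which here is also the only one):

```
# The relay at domainA strips its own routing hop from the address,
# then turns the % in the local part into an @ for the next hop.
addr='user%domainB@domainA'
local_part=${addr%@*}        # -> user%domainB
echo "${local_part/\%/@}"    # -> user@domainB, relayed onward to domainB
```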
Even someone who knows that git isn't GitHub might not be aware that ssh is enough to use git remotely. That's actually the case for me! I'm a HUGE fan of git, I mildly dislike GitHub, and I never knew that ssh was enough to push to a remote repo. Like, how does it even work, I don't need a server? I suspect this is due to my poor understanding of ssh, not my poor understanding of git.
You do: an SSH server needs to be running on the remote if you want to ssh into it using your ssh client (the `ssh` command on your laptop). It's just not an HTTP server, is all.
You start that server via the `sshd` [systemd] service. On most VPSs it's enabled by default.
Git supports both http and ssh as the "transport method". So, you can use either. Browsers OTOH only support http.
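As a sketch, assuming you have shell access to some box with git installed (the user, host, and paths here are made up):

```
# Create a bare repo on the remote, then use plain ssh as the transport.
ssh user@example.com 'git init --bare ~/project.git'
git remote add origin ssh://user@example.com/~/project.git
git push origin main
```

No web service, no daemon beyond `sshd` itself; git runs on the remote only for the duration of each push or fetch.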
Edit: hey, this is really exciting. For a long time one of the reasons I've loved git (not GitHub) is the elegance of being a piece of decentralized software that actually works well. But I'd never actually used the decentralized aspect of it: I've always had a local repo and then defaulted to GitHub, Bitbucket or whatever, because I always thought I'd need to install some "git daemon" to achieve this and couldn't be bothered. But now, this is so much more powerful. Linus Torvalds, best programmer alive, change my mind.
BTW, a nice example of this general concept is Emacs' TRAMP mode. This is a mode where you can open and manipulate files (and other things) on remote systems simply by typing a remote path in Emacs, e.g. `/ssh:user@host:/etc/hosts` in the find-file prompt. Emacs will then simply run ssh/scp to expose or modify the contents of those files, and of course to run any required commands, such as deleting a file.
Git is distributed, meaning every copy is isolated and does not depend on any other copy. Adding a remote is mostly just giving a name to a URL (URI?) for the fetch, pull, and push operations, which exchange commits. Since commits are immutable and form a chain, it's easy to tell where two nodes diverge, and conflict resolution can take place.
From the git-fetch(1) manual page:
> Git supports ssh, git, http, and https protocols (in addition, ftp and ftps can be used for fetching, but this is inefficient and deprecated; do not use them).
You only need access to the other node's repo information. There's no server. You can also use a simple path and store the other repo on a drive.
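A sketch of that last point (paths are made up):

```
# A plain directory on disk works as a remote; no daemon involved.
git clone --bare ~/project /mnt/usb/project.git
cd ~/project
git remote add backup /mnt/usb/project.git
git push backup main
```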
Doable. It would basically be ssh but without encryption. You'd have to bang out the login by hand, but "username\npassword\n" will probably work; you might need a sleep in between, and of course you'll have to detect a successful login too. Oh, and every 0xff byte will have to be escaped with another 0xff.
At that point, may as well support raw serial too.
Supporting rlogin, on the other hand, is probably as simple as `GIT_SSH=rlogin`.
The server is git itself: it ships two commands, git-receive-pack and git-upload-pack, which the client starts through ssh and communicates with over stdin/stdout.
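Roughly what the client runs under the hood (host and path are hypothetical):

```
# `git fetch` starts the upload side on the remote and talks to it
# over the ssh channel's stdin/stdout:
ssh user@example.com 'git-upload-pack ~/project.git'
# `git push` starts the receiving side instead:
ssh user@example.com 'git-receive-pack ~/project.git'
```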
You can think of ssh://server/folder as a normal /folder. ssh provides auth and encryption for a remotely hosted folder, but you can forget about that for the purpose of understanding git's model.
SSH is just a transport to get access to the repo information. The particular implementation does not matter (I think). Its configuration is orthogonal to git.
I mean, you do need an ssh server. Basically, ssh can run commands on the remote machine. Most commonly the command is a shell, but it can also be a git command.
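For example (hostname made up):

```
# No interactive shell; ssh just runs the given command remotely and exits.
ssh user@example.com 'git --version'
```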
> Asked about the leaked document, Amazon spokesperson Margaret Callahan described it as “obsolete” and said it “completely misrepresents Amazon’s current water usage strategy”.
> “A document’s existence doesn’t guarantee its accuracy or finality,” she said. “Meetings often reshape documents or reveal flawed findings or claims.”
Here's a person trained to speak in half-truths. Am I the only one who finds this revolting? Please tell me I'm not alone.
Maybe, but it is entirely possible that someone put together some numbers and got them completely wrong, in which case those statements sound pretty reasonable. They could be misleading or outright false, but we don't know that either way based on those statements.
The claims made by the document referred to in the article are potentially harmful to Amazon. If they were untrue and the truth painted Amazon in a better light, they would likely be willing to counter the article with information of their own.
That they instead respond with a total non-answer is a signal that either the document is accurate, or the truth is worse.
Not the grandfather commenter... but why yt-dlp? NewPipe is a one-stop shop on your mobile device; yt-dlp means searching for the video in the browser and copying the URL, running yt-dlp on the computer, and then transferring the downloaded video onto your phone...
With NewPipe you can be at the airport waiting for your flight, do it all on the device, and then hand it over to your kid (I'll leave out the commentary on using video to babysit them...)
Right. I guess where we differ is that I actively try to avoid spending time on my mobile. I have never watched a YouTube video on my phone in my entire life, and won't.
I'm not the person you asked, but I like it because: it's easier to initiate downloads from mobile, it has a nice UI for quality selection, it can import subscriptions, and it keeps a download log that makes it easy to play recent downloads.
Regulators could say "you're not allowed to make more than X profit". They already do that with utilities, so it's not a matter of practical impossibility.
I continue to believe that in the case of oversized margins, the government should just enter the market itself. Buy the smallest competitor and operate it at a reasonable margin, growing it at every opportunity. If the rest of the market lowers their margins to beat it, spin the thing off.
Basically don't bother to dictate margins, just declare that market a failure.
The problem with this is that it ends up being a signal in and of itself: when you say the cap is X, everyone immediately sets their profits to X and never budges from there.