Hacker News | PaulWaldman's comments

This article references "custom code." What about custom applications?

Are all government contractors required to provide the source code for all developed applications? Or does this bill only apply to contracts where the deliverables actually include source code?


It will be interesting to see the durability of print vs digital content over time.

Many web properties are no longer accessible due to M&A activity and small/solo publishers unable or unwilling to maintain their assets. Archives like the Wayback Machine mitigate some of the loss of digital content, so long as the archives themselves are maintained.

Will spinning rust be as durable as Microfiche?


> Will spinning rust be as durable as Microfiche?

Not sure how long microfiche lasts, but someone posted a link here not too long ago about how record companies embraced magnetic hard drives in the 1990s to store music masters and are now finding that the drives are no longer readable.


It depends a lot on the humidity and heat or light in the environment where the microfiche are being stored. But they should be able to retain their data for 500 or so years.

CDs and Laserdiscs are also seeing bitrot. The layer of material that is etched does degrade over time. Error correction helps some, but if it's a writable CD or DVD it's only likely to last a decade or two. M-Drives are CDs that are designed to retain their data for about 1000 years and can be written by specific consumer drives. Not sure how long professionally pressed CDs last, but it's not that long.


Googling from your comment led to M-Discs, which are available in DVD or Blu-ray formats, up to 100 GB per disc. That looks extremely useful.


Ah, thanks for catching the typo. It was getting late for me; I should have pulled up a link or something, because I haven't worked with these discs in a decade or so.

yeah those are the ones I'm referring to -- if you're archiving something like family history or data that needs to be good for centuries (without having to re-copy and juggle), those are a better choice than just about anything else.

https://en.wikipedia.org/wiki/M-DISC


What beats M-Disc? Genuinely curious just having bought one.


Nothing comes to mind that you can interface with a computer, but when I wrote the phrase I was thinking of projects on the scale of Long Now [0], requiring physical etching on materials and very careful storage.

Alternatively, tell people that they can't store something and you're likely to find it robustly mirrored by many.

[0] https://longnow.org/ideas/very-long-term-backup/


Well that was a fascinating diversion. This is bonkers!

https://norsam.com/products/buddhist-nano-film/


We can all help in a small way. Archive.org is a non-profit and always needs financial support.

https://archive.org/donate


As photography was largely switching to digital, I sometimes wondered, whatever the preservation possibilities digital offered, to what degree photos would really be preserved in practice relative to prints and slides.


Most photos are terrible. Colors can start fading in as little as 10 years if they were hanging on your wall that long. B&W can last longer, but will still fade. Of course there are different processes; if you use the best process, photos will last longer, but they are still not very stable.

Digital makes it cheap and easy to have multiple copies in many locations. While any one medium may fail, you still have a copy - I have on this computer all the data from whatever computer I was using 15 years ago. (Most of it I have not looked at in 20 years and could safely delete, but it is still here, and on my other backup systems.)


Kodachrome is an amazing archival color film that, when stored properly, will last centuries. B&W negative film is even more stable.

You make a good point about the instability of many types of chemical photo processes (especially color negative and print processing). I do think many digital formats will be lost to time while a color transparency or B&W negative will still be viewable without much aid well into the future.

One of my favorite photo books is the re-photographic survey project by Mark Klett. He went around re-capturing the exact locations (and camera positions) of notable images of the American West from the early days of the US Geological Survey, when they had a plate photographer on the team. We are talking about a time period just after the US Civil War, so we see a landscape captured in time 10 decades or more after the original.

I've been a pro photographer for over 30 years. All my earliest digital work is archived in RAW so I have the original shooting data. It's all triple backed up, and I have a friend who lets me store one of my backups at his home. I've been amazed at how many photographers lost track of or threw away their older work. I'm still licensing my work hundreds of times a year, and some of this older material is becoming even more valuable simply due to scarcity. The redundancy of digital is great if you take archiving seriously.

Yet, I still have drawers of original film from the late '80s to early 2000s. I'm scanning a few but will probably let many be disposed of . . .


My point was there's the capability to do all this backup preservation but it doesn't just happen. And it's less visible in many cases than the proverbial shoebox full of photos will be.


What is the difference between photos on a crashed hard drive and photos in a shoebox that just burned in a house fire? Physical photos are vulnerable to many different failure modes, just like digital data.

These days your photos are probably backed up by Facebook, Google, or other such major players. (There are a lot of privacy concerns with the above, but they do tend to have good backups.)


There is a lot of serendipitous backing up with social media. There was also a lot of serendipitous passing on to relatives of physical media. Not sure which better stands the test of time. (And I'm sure it varies.)


Often passing on to relatives is done with the only print (well, you retain the negative). School pictures come in packages of many, but otherwise you typically only print one copy.


Anecdotally, using the Gemini App with "Gemini Advanced 2.0 Flash Experimental", the response quality is ignorantly improved and faster at some basic Python and C# generation.


> ignorantly improved

autocorrect of "significantly improved"?


Eh, Tesla's FSD and Autopilot as well as GM's Super Cruise are all classified as SAE J3016 Automation Level 2.


Which tells you approximately nothing about their relative capabilities. The fallback system is the driver (who needs to be ready to take over immediately), and the range of conditions for autonomous operation is better than Level 1.


> the range of conditions for autonomous operation is better than Level 1.

It better be, because Level 1 is satisfied with just cruise control with automatic braking.


> I've come to notice that there is no incentive for 3rd party ev-chargers to be dependable.

I don't know, how often would you stop at a gas station if their pumps weren't reliable? Many 3rd party chargers are selling electricity at a mark-up.


How many broken gas stations could you build if the government gave grants for building gas stations, but not for keeping them running?


The US federal subsidy for EV chargers requires 97% uptime across the year:

https://www.federalregister.gov/documents/2023/02/28/2023-03...


Unfortunately, the lost income is marginal. The risk for Tesla/Rivian, or even a gas station, is more existential.


Yeah, the creation of EA from the Dieselgate settlement money probably has a lot to do with why they don’t feel the heat on unreliable chargers.

I suppose it would be different if chargers were run by electric utilities or were there to goose the sales of convenience stores.


I wasn't aware that EA was owned by VW, I thought it was a separate business. Suddenly all the complaints from friends with EVs make tons more sense.

From Wikipedia: (Electrify America) is a subsidiary of Volkswagen Group of America, established in late 2016 by the automaker as part of its efforts to offset emissions in the wake of the Volkswagen emissions scandal. Volkswagen, as part of its settlement following the "Dieselgate" emissions scandal, invested $2 billion in creating Electrify America.

https://en.m.wikipedia.org/wiki/Electrify_America


That's true, but there's also another weird counteracting effect. Let's say you have 4 chargers offering a full 350 kW each. If they get used at full power for ~15 minutes in the month, the demand charge for the month might be ~$10k (probably will be higher, tbh). This will be true even if the stations themselves see ~15% utilization.

But break two of them? Yeah, there might be an occasional line, but your fee drops to $5k and you still produce the same revenue. TBH, some of these stations likely have better ROI when a stall or two are broken.

High utilization sites are completely different, of course.
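The back-of-the-envelope math above can be sketched out. This is only an illustration: the $/kW demand rate below is a hypothetical number chosen to roughly reproduce the ~$10k/~$5k figures in the comment, not a quote from any real utility tariff.

```python
# Demand charges bill the peak kW drawn in a month, regardless of how
# few kWh are actually sold. Rate below is assumed for illustration.
DEMAND_RATE = 7.15  # $/kW-month (hypothetical)

def monthly_demand_charge(working_stalls: int, kw_per_stall: float) -> float:
    """Peak demand occurs if every working stall charges at full power
    at once, even briefly; the whole month is billed on that peak."""
    peak_kw = working_stalls * kw_per_stall
    return peak_kw * DEMAND_RATE

four = monthly_demand_charge(4, 350)  # all stalls working
two = monthly_demand_charge(2, 350)   # two stalls "broken"

print(f"4 stalls: ${four:,.0f}/month")  # ~ $10k
print(f"2 stalls: ${two:,.0f}/month")   # ~ $5k
# At low utilization, revenue may barely change while the fixed demand
# charge is halved -- the perverse incentive described above.
```
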


The stations have all kinds of strategies for avoiding these max demand charges. They can slow down the charge rate to save money for them, but there are lots of places that have local battery storage to reduce the max demand charge likelihood.


You can sell something at a markup, but a gas station sells so many $50 fill-ups per day compared to an EV station that it probably has about 1000x more revenue.

Sure, their margin might be better than a razor-thin gas station margin, but for the time being it's a miserable business. Unless there's another incentive, like selling cars.

So yeah. Like the poster said, there is no incentive for 3rd party EV chargers to be dependable.


Has VMware won any significant customers since Broadcom's acquisition?


Have they tried to win any customers?


This still feels a long way off. I have yet to encounter a charger, not made by Tesla, that has an NACS plug.


Everyone in the US has committed to NACS in the next couple of years. The other networks are aware of this and, if they are not completely stupid, are making plans. They will probably support CCS for a while in some form, but they will be doing NACS in the near future. It might be like regular/premium/diesel fuels - pumps support more than one hose (though a cable is more expensive than a hose).


> They will probably support CCS for a while

NACS is CCS with a different plug on the end. Tesla's charging protocol is set to die off; CCS will be the standard going forward.

Here's a real world demonstration of a charger with J3400 plugs (aka Tesla's plug):

https://www.youtube.com/watch?v=Y3-0xRTduPI

It works on a Chevy Bolt because it is CCS.


So what does this mean for a Tesla on an NACS charger? Do they already support CCS over that port, or is it a software update?

I know Teslas were already CCS in Europe, so I wouldn't be surprised if the software is already basically there.


Tesla version 3 superchargers already use CCS for communication between car and charger for 2019 and later vehicles.

Version 2 superchargers do not speak CCS and won't ever be opened to non-Tesla vehicles, and there are certain routes where that's quite annoying. My most common road trip has 5 supercharger locations along the way, with 3 of them being V2, including the most isolated charger. Even once the NACS changeover happens, taking a non-Tesla on that journey will be a real pain.

Tesla has not put any resources into converting V2 sites into V3s. Some of the locations have been expanded with the new additions being V3, but I haven't seen much in the way of switchover.


Now that it's open and standardized, you'll probably see tons of third party charging stations with the NACS plug. Tesla's part in bootstrapping charging infra to make EVs viable seems to be mostly done.


Older Teslas (approx 2019 and older) need a hardware update.


Damn, ouch. Seems crazy that they're now in the position of breaking supercharger compatibility with existing cars.


No compatibility break as far as I know. I believe V3 superchargers speak CCS to 2019 and newer cars and fallback to the older Tesla-proprietary communication if necessary.


Sure, but some Tesla owners are going to be surprised when they pull up to a third party charger with a Tesla plug that doesn't work on their car even though it plugs in.


Only ~once.

A little bit of pain when there aren't that many vehicles is fine if it makes things quite a lot better going forward.


I just scheduled the upgrade for my 2018 Model 3. It’s not free, but only costs $280 and they do it in my driveway. Rumor has it I might get a CCS adapter out of the deal too, but I’m not sure. It’s an older car but still going strong, and my biggest concern is that if I don’t do the upgrade now they’ll run out of upgrade kits and the car’s value will be permanently damaged even if it lasts another ten years.


Yes, you get a CCS adapter bundled for no extra charge with the retrofit. I just had mine done.


So retrofit + CCS->NACS adapter for $280 from Tesla cf. NACS->CCS adapter for $225 from GM.


ah, gotcha. Yeah, I can live with broken compatibility with just the newer non-Tesla chargers, especially if the upgrade is as cheap as folks here are saying.


Either EA or EVgo just deployed their first one in the last few weeks.


There's a huge wave of chargers building over the next 24 months. Many will have NACS.


That is the definition of the current state.

Fortunately for the future state, they can be changed.


Not only sideloading - there are actual Play Store alternatives already available for Android.


The Play Store also tries to, very annoyingly, update apps that weren't installed from the Play Store.


I'm a long time F-Droid user and have never seen this. Under what circumstances?


I installed Nextcloud from F-Droid, and the Play Store tries to update it. Nextcloud is also available on the Play Store, and newer versions appear there before F-Droid.


This probably happens when installing an APK of an app that is also available on the Play Store; since it's the same app id, the Play Store will pick it up.

The main use case for doing this is probably ReVanced, especially for patches like Reddit's that don't change the app id.
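The matching logic above can be sketched in a few lines. Android identifies an app solely by its application id and orders builds by the integer versionCode, so any store whose catalog carries the same id with a higher versionCode will offer an "update". The function and dict shapes below are illustrative, not a real Android API; the version codes are made-up numbers.

```python
# Toy model of how a store decides an installed app is "its" app to update:
# match on application id, compare integer version codes.

def has_update(installed: dict, catalog_entry: dict) -> bool:
    """True if the catalog entry matches the installed app's id
    and carries a higher versionCode."""
    return (installed["app_id"] == catalog_entry["app_id"]
            and catalog_entry["version_code"] > installed["version_code"])

# Nextcloud installed from F-Droid...
installed = {"app_id": "com.nextcloud.client", "version_code": 1001}
# ...while the Play Store catalog lists a newer build under the same id:
play_entry = {"app_id": "com.nextcloud.client", "version_code": 1002}

print(has_update(installed, play_entry))  # True -> Play Store offers an update
```

This is also why a patched APK that keeps the original app id gets picked up, while patches that rename the app id escape the Play Store's notice.
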


How does this compare to the Parquet S3 Foreign Data Wrapper?

https://github.com/pgspider/parquet_s3_fdw


We talk about it more in our first post for pg_lakehouse: https://blog.paradedb.com/pages/introducing_lakehouse

This FDW does not support predicate pushdown, so performance is weak. pg_lakehouse has the same performance as DuckDB. In addition, we're supporting table formats like Apache Iceberg and more object stores, and have many other plans to improve analytics in Postgres through pg_lakehouse.
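To see why predicate pushdown matters so much for a columnar format like Parquet: the reader can consult per-row-group min/max statistics and skip whole row groups that cannot match the filter, instead of fetching and scanning everything. The sketch below models that idea in plain Python with made-up row groups; real pushdown happens inside the engine/FDW, not in application code.

```python
# Toy model of row-group skipping via min/max statistics, the core of
# predicate pushdown over Parquet. Data and group layout are illustrative.

row_groups = [
    {"min": 0,   "max": 99,  "rows": list(range(0, 100))},
    {"min": 100, "max": 199, "rows": list(range(100, 200))},
    {"min": 200, "max": 299, "rows": list(range(200, 300))},
]

def scan_gt(groups, threshold):
    """Return rows > threshold, skipping groups whose max rules them out."""
    out, groups_read = [], 0
    for g in groups:
        if g["max"] <= threshold:
            continue  # statistics prove no row here can match: skip the I/O
        groups_read += 1
        out.extend(r for r in g["rows"] if r > threshold)
    return out, groups_read

rows, read = scan_gt(row_groups, 250)
print(len(rows), read)  # 49 matching rows, and only 1 of 3 row groups read
```

Without pushdown, the FDW has to pull all three row groups out of S3 and filter in Postgres; with it, two thirds of the reads never happen at all.
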


Does the S3 FDW support predicate pushdown?


Interesting that this post appears in the same week as Paramount pulling much of their archived cable TV content. The internet giveth and the internet taketh away.

