I think any certification that requires a plan be in place for future support is going to be either too weak (if the company goes under, support is gone) or too burdensome (requiring a trust or similar fund be set up to cover future support should the company go under is very inefficient). Just allow more control by the user. This lowers barriers to entry for the market, and allows for interesting and unique future uses.
As for non-standard methods of configuration and support, I see no problem with requiring your own software if you want it to be supported, and requiring it be set to default levels (or a verification that no values are outside norms) before a support incident is allowed. You want to permanently log whether any settings have been set that void your warranty? Go ahead. This is a solved problem. Try to get Apple to fix your phone if the little water sensor inside has been tripped.
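To make that concrete, here is a rough, entirely hypothetical sketch of the "permanently log settings that void your warranty" idea: once any setting leaves the approved range, a one-way flag is set and never cleared, and support tooling checks it before an incident is allowed. The setting names and ranges are made up for the example.

```python
# Hypothetical sketch: a one-way "warranty flag" that trips when any setting
# leaves its approved range, checked before a support incident is opened.
APPROVED_RANGES = {"tx_power_dbm": (0, 20), "report_interval_s": (60, 3600)}

def apply_setting(state: dict, name: str, value: float) -> None:
    low, high = APPROVED_RANGES[name]
    if not (low <= value <= high):
        state["warranty_flag_tripped"] = True   # one-way, like the water sensor
    state[name] = value

def support_incident_allowed(state: dict) -> bool:
    return not state.get("warranty_flag_tripped", False)

device = {"tx_power_dbm": 10, "report_interval_s": 300}
apply_setting(device, "tx_power_dbm", 30)        # out of range: trips the flag
print(support_incident_allowed(device))          # False
```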
I see no problem certifying hardware, I'm just not sure how something like UL for software works in practice.
We likely have different ideas of what constitutes future support. You seem to want the A+ package. I think it is sufficient that consumers get a pro-rated amount of their money back, depending on time since purchase. That could be handled similarly to insurance or escrow. (Although that is still a form of trust for future support.) If you buy something for $100 and 24 months later they pull service, should you get back $10 or $100? Only the smaller end seems viable to me.
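As a rough illustration of linear pro-rating, assuming an expected service life (the 36-month figure is just an assumption for the example):

```python
# Rough sketch of a linear pro-rated refund over an assumed service life.
def prorated_refund(price: float, months_since_purchase: int,
                    expected_service_months: int = 36) -> float:
    remaining = max(expected_service_months - months_since_purchase, 0)
    return round(price * remaining / expected_service_months, 2)

print(prorated_refund(100, 24))   # 33.33 -- service pulled two years in
print(prorated_refund(100, 35))   # 2.78  -- near end of life, the "$10" end
```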
> Just allow more control by the user
Given infinite people, time and money, that is easy. I can't see any way of it being practical though. It is considerably more engineering effort and considerably more support effort (all that stuff has to be documented, tested and revised), it will increase time to market, etc. The only winners are those folks still using the device when service is pulled!
I'm sceptical that generic third party configuration could be reasonably done well. Is there any example of something similar? SNMP was mentioned as a failure by another commenter.
Certifying the software could be done like some other certification is done:
* Publish the list of best practices (eg authentication, update mechanisms, how they participate in the network, long-term service)
* Have auditors (yeah, they would need to be certified too) who come in, examine a product/code base/service, and give their blessing for the relevant best practices. I'd imagine some open source libraries would end up certified too (eg an updater, or authentication), so a quicker path to product certification would be to use already-certified components.
* Award the certification based on auditor report
Yes it could be gamed, there would be wiggle room, bad actors etc. But it would be a lot better than what we have now, and a feedback loop would help it improve over time.
BTW I am working on a product somewhat in the space. I'd love to make it completely open, available forever, plug in your own servers, etc. But it isn't remotely viable for people, time and money reasons.
I am not by any means an engineer, so you can most definitely take what I say with a grain of salt. But my understanding is that all that really needs to happen for IoT devices to be usable by a third party is for the manufacturer to simply publish the API and provide a mechanism for the end user to control what interfaces with that API. It doesn't actually require you to standardise the API, since you have never guaranteed compatibility with any third party; it simply requires documentation, which would have to exist internally for development purposes anyway, and on some level some loss of control. This way you get to maintain control and development for as long as you want to support the device, and the people who want to support it themselves after you have abandoned it can do so, because the API is a known entity.

I am probably fundamentally misunderstanding some aspect of the development process here, but this does not seem that burdensome to me. I don't even see the necessity of a standard that needs maintaining, just documentation of the API and a mechanism to change where the data is sent. Hell, if you want to maintain control for the life of the device, fine, but what is so hard about releasing that information when you no longer want anything to do with it? What possible commercial value does that information hold then?
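The "mechanism to change where the data is sent" could be as simple as a user-editable config file holding the upload URL, which the vendor documents. A hypothetical sketch (file path, keys and payload shape are all made up for the example):

```python
# Hypothetical illustration: the device reads its upload endpoint from a
# user-editable config file, so once the vendor's cloud goes away the owner
# can point it at their own server.
import json
import urllib.request

def load_endpoint(path: str = "/etc/iot-device/endpoint.json") -> str:
    with open(path) as f:
        return json.load(f)["upload_url"]   # e.g. "https://my-own-server.example/ingest"

def upload_reading(reading: dict) -> None:
    req = urllib.request.Request(
        load_endpoint(),
        data=json.dumps(reading).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

upload_reading({"sensor": "temp", "celsius": 21.5})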
> ... for the manufacturer to simply publish the API ...
That is more work than not publishing the API. How much more work, then? I'd pick the stats from The Mythical Man Month, where they claimed that doing something for external consumption is nine times the effort of doing it for internal use only. Admittedly it predates AWS, containers, etc., but then again those aren't that useful for IoT.
Ok, that makes some sense, but why not just provide the API and a mechanism to change where data is sent after you have decided not to support the device anymore? Surely that is not too burdensome, and it would give someone out there the option to continue using the device (and probably save some goodwill over the dropping of support)?
No, I'm fully on board with zero support, given that's communicated up front. Let the market sort out whether people think that's a good idea. I care less about certifying devices to some standard level of quality, mainly because we're so far away from knowing what that is, than I do about opening up the ecosystem to fight planned (and negligent) obsolescence.
> I'm sceptical that generic third party configuration could be reasonably done well. ... SNMP was mentioned as a failure by another commenter.
SNMP is still used quite heavily in network devices. It's painful because it's complicated, but it's still one of the standards you can rely on any enterprise network device to support (as I understand it; I've dealt with it less in the last five years or so), at least for reading information. In that respect, it's a success story. That said, I imagine the average network device in that category might be an order of magnitude or more harder to deal with when it comes to configuring (physical connector options, link level options, protocol options, route control and propagation, access lists, etc.).
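For the reading side, a query is only a few lines. This sketch assumes pysnmp's classic synchronous high-level API (pysnmp 4.x style); the host and community string are placeholders.

```python
# Read a single value (sysDescr) from a device over SNMPv2c using pysnmp's
# classic synchronous hlapi. Host and community string are placeholders.
from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                          ContextData, ObjectType, ObjectIdentity, getCmd)

error_indication, error_status, error_index, var_binds = next(
    getCmd(SnmpEngine(),
           CommunityData("public", mpModel=1),          # SNMPv2c
           UdpTransportTarget(("192.0.2.1", 161)),
           ContextData(),
           ObjectType(ObjectIdentity("SNMPv2-MIB", "sysDescr", 0)))
)

if error_indication:
    print(error_indication)
else:
    for name, value in var_binds:
        print(f"{name} = {value}")
```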
> Is there any example of something similar?
IP, TCP, UDP, POP3, IMAP, HTTP. The core protocols started as free standards; for mail, compare that to Exchange. Exchange persists due to its advanced capabilities and tight integration with other Microsoft products, but it's not what ended up being used for integration in the current generation of mail programs. They all support IMAP and/or POP3, because why write your own tools to migrate mail or provide third-party access when a suitable protocol and suite of tools already exist?
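Case in point: Python ships an IMAP client in the standard library, so third-party access to mail is a few lines against any compliant server (the host and credentials here are placeholders):

```python
# List the first few messages in a mailbox over standard IMAP, using only the
# Python standard library. Host and credentials are placeholders.
import imaplib

with imaplib.IMAP4_SSL("imap.example.com") as conn:
    conn.login("user@example.com", "app-password")
    conn.select("INBOX", readonly=True)
    status, data = conn.search(None, "ALL")
    for num in data[0].split()[:5]:               # peek at the first five messages
        status, msg_data = conn.fetch(num, "(RFC822.HEADER)")
        print(msg_data[0][1].decode(errors="replace").splitlines()[0])
```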
> BTW I am working on a product somewhat in the space. I'd love to make it completely open, available for ever, plugin your own servers etc. But it isn't remotely viable for people, time and money reasons.
Yeah, and I'm not arguing that you should do this. But if a standard existed, and there were free or cheap libraries that did most of the heavy lifting for you, it's possible that would make it the more viable choice for those same reasons. E.g. a lightweight HTTP server with a route engine to hook into your own libraries, or even just to write to a config file and HUP your daemon.
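A minimal sketch of that last idea using only the Python standard library: one route accepts a new upload endpoint, writes it to a config file, and HUPs the daemon so it reloads. The paths and PID file location are assumptions for the example.

```python
# Tiny config server: POST /config/endpoint writes the new upload URL to a
# config file and sends SIGHUP to the running daemon so it reloads.
import json
import os
import signal
from http.server import BaseHTTPRequestHandler, HTTPServer

CONFIG_PATH = "/etc/iot-device/endpoint.json"
PID_FILE = "/run/iot-daemon.pid"

class ConfigHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/config/endpoint":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length))
        with open(CONFIG_PATH, "w") as f:
            json.dump({"upload_url": body["upload_url"]}, f)
        with open(PID_FILE) as f:                  # tell the daemon to reload
            os.kill(int(f.read().strip()), signal.SIGHUP)
        self.send_response(204)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), ConfigHandler).serve_forever()
```

The point being that none of this needs a bespoke cloud: a standard-ish local HTTP surface plus a reload signal is cheap to build if common libraries already cover it.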