I moved away from the .NET world less than three years ago and this announcement might as well be written in Klingon or ancient Greek to me.
Now, keep in mind that I actually thought (and still think?) that .NET and C# make for a very productive development environment. I didn't leave due to antipathy; no axe to grind here.
> .NET Core is side by side and it’s moving fast. It’s WAY faster than .NET (Full) Framework can move which is good. By building ASP.NET Core 2.0 on top of .NET Core 2.0 (which, remember, is a SUPERSET of .NET Standard) that means stuff can be built faster than NetFx or even Netstandard.
.NET Core? .NET Full? .NET Standard? .NETfx?
And .NET Core is a superset of .NET Standard? How does that make sense? "Core" sounds like a stripped-down version of... something? But, apparently it's a superset. Because nothing makes sense any more.
My only point here is: at first glance, this is an utterly bewildering set of choices. I'm sure there's a pretty simple relationship between all these things, but the .NET people probably aren't doing themselves any favors with these naming choices and (what appears to be) fragmentation between .NET development targets.
While I hate to interfere with any Microsoft bashing, the use of system32 is actually because too many 3rd party apps didn't bother calling "GetSystemDirectory()" and just assumed it was system32. And Wow64 is Windows On Windows 64, a subsystem for making non 64 bit apps run on 64 bit Windows. So that one will remain properly named until SysWow128 comes about.
Sure, there is a valid reason (it's not like they got drunk and made this all up). But the result is completely backwards from what everyone would expect it to mean. They might as well have named it FHJAL3481JF because at least it means nothing, instead of the opposite of what everyone expects. Sys32OnWin64 might have been a better choice.
It isn't a valid assumption (nor expectation) that MS would design it so that anyone not skilled in Windows development would understand it. If you want to see a nightmare, look at a *nix file system. If it were a contest, I'm sure most people would give the point to Microsoft on this one.
.NET Standard is a standard that the 3 different implementations of .NET try to implement (.NET, .NET Core and Mono). Any implementation of .NET Standard can add in extra functionality, hence superset; it's a bit silly.
> .NET Standard is a standard that the 3 different implementations of .NET try to implement (.NET, .NET Core and Mono). Any implementation of .NET Standard can add in extra functionality, hence superset; it's a bit silly.
It's actually pretty smart if it's implemented correctly (which it's not; there's another article floating around at the moment which says that .NET Standard isn't working correctly because people pick and choose what to implement anyway).
So the .NET Standard is essentially like "interfaces" in code. It says: if you want to claim you are .NET Standard 1.6, you must implement these things.
When you go to write a library that you want to release to the masses, you would normally have to ask "OK, do I want to write this for .NET Core or .NET Framework?". With .NET Standard you can write it in a way that you know a call to some method will always be there, no matter what platform is actually running the library.
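The interface analogy can be sketched in C# itself. Everything below is illustrative (none of these are real framework types); it just shows why a library written against the "contract" runs on any implementation, and why each implementation is a superset of the standard:

```csharp
using System;
using System.Collections.Generic;

// Illustrative only: these are not real framework types. .NET Standard
// acts like an interface -- a list of APIs that every conforming
// runtime promises to provide.
public interface INetStandard16
{
    void WriteLine(string message);
}

// Each runtime "implements" the contract...
public class FakeRuntime : INetStandard16
{
    public List<string> Lines { get; } = new List<string>();
    public void WriteLine(string message) => Lines.Add(message);

    // ...and is free to add members beyond the standard, which is why
    // an implementation is called a superset of the standard.
    public void RuntimeOnlyExtra() { }
}

// A library coded against the standard runs on any conforming runtime;
// you no longer pick ".NET Core vs .NET Framework" up front.
public static class PortableLibrary
{
    public static void Greet(INetStandard16 runtime) =>
        runtime.WriteLine("hello from a portable library");
}
```

In the real system the "interface" is the netstandard reference assemblies and the "implementations" are .NET Framework, .NET Core and Mono.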
I always loved .NET Framework, but I agree, some of the names are confusing and don't make entire sense later on. Like if I want to code ASP .NET Core in SublimeText 3, the plugin mentions ASP .NET vNext, and I'm honestly not sure if that was just a codename for ASP .NET Core or some abandoned version of .NET (the fact it's not even in your post...), and it's sad. The only ones I know currently are:
* .NET Core - a new spin on .NET Framework for the cloud and such; sometimes, but not always, compatible with .NET Framework.
* .NET Framework (or "Full") - the .NET Framework we've known and loved, what was a few years ago merely known as .NET
* .NET Standard - the bridge between the two, made after .NET Core so that they could break backwards compatibility to some extent, or they realized "whoops we gotta make a compatibility layer" or some other reason...
I'm not even sure what .NETfx is, nor what ASP .NET vNext was, or if it was just an alpha. All I know is that in Sublime Text I don't have sane / easy options. On Visual Studio Code I can just install the "C#" plugin on the other hand and get going. It's not as good as Visual Studio's IntelliSense though.
One nice thing about .NET Core is the ability to develop from any platform and to deploy to any platform. .NET Framework needed Mono for that, I wonder how much of .NET Core will become Mono Core or something similar (if at all?).
So am I right in assuming .NETfx and .NET Full are the same? Or... Yeah, I still love .NET, just not a fan of the weird transition phase. I can't help but think that Core should have been kept in alpha for this time frame, but then people would complain that it isn't ready for prime time.
Plus, .NET Core is the first developed openly on GitHub as open source, so you get the teething pains of seeing how all the sausage is made. On the one hand this is great to see so much transparency and open source collaboration. On the other hand, for a lot of .NET veterans used to only hearing about .NET updates maybe once every three years at a well rehearsed developer conference, the world of open source transparency is definitely a fire hose of information updates and that makes for a very rough transition.
I too am actively moving away from .NET and I'm moving to Elixir. Not expecting to be able to leave my .NET job any time soon but I'm still transitioning for hobby projects.
.NET Standard looks like a description of features, not an implementation. It's the standard against which all other .NET implementations are measured.
.NET core is the new guy in town that is cross platform. ASP.NET Core is the web framework built on top of .NET core.
.NET Core is called a superset of .NET Standard because it implements everything and more.
.NET Standard - interface/API definition, not implementation, that specifies what functionality is supported at what version number.
.NET Framework - older desktop/windows-only that implements the .NET Standard. Various other smaller frameworks created over the years for mobile, embedded, etc.
* .NET Core - newest cross-platform framework that implements the .NET Standard, although with less coverage than the full framework because it's new; .NET Core 2.0 will reach much closer parity and will eventually outpace it.
ASP.NET - web framework built on top of the .NET Framework that offers webforms, mvc, web api, razor views, etc related to making webapps.
ASP.NET Core - web framework built on top of the .NET Core framework that offers a faster, streamlined pipeline/middleware model with mvc/web api, razor views, SPA services and integration, etc.
Yes Microsoft is one of the worst companies at naming -- however it makes perfect sense that ASP.NET Core (the web framework) targets .NET Core (the underlying cross-platform framework) which itself implements .NET Standard.
It's a case of superset functionality being layered on top. Eventually the .NET Core code will move faster and outpace .NET Framework, so you will be able to run .NET Framework libraries inside .NET Core apps but not run .NET Core apps inside .NET Framework.
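This layering also shows up directly in project files. A minimal sketch (the project names are placeholders; `netstandard2.0`, `netcoreapp2.0` and `net461` are the real target framework monikers):

```xml
<!-- MyPortableLib.csproj (hypothetical): targets the standard, so it
     can be referenced from .NET Core, .NET Framework and Mono alike -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
</Project>

<!-- MyWebApp.csproj (hypothetical): an app targets one concrete
     implementation, e.g. .NET Core, and can consume the library above -->
<Project Sdk="Microsoft.NET.Sdk.Web">
  <PropertyGroup>
    <TargetFramework>netcoreapp2.0</TargetFramework>
  </PropertyGroup>
</Project>
```

Libraries target the standard; apps target an implementation. That split is the whole naming scheme in two settings.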
> Yes Microsoft is one of the worst companies at naming
I think you're being too generous. Microsoft branded its Office suite, BA suite and operating system .NET once upon a time. They'd have named their own variety of orange juice OJ.NET if they'd had one.
Every major company has naming problems from Google to Oracle to IBM and more. It's just teams, bureaucracy and marketing forces all fighting against each other.
I do think Microsoft could just hire a VP of Naming who's in charge of all names and improve productivity by billions but alas, this is where we are today. However, once the names become familiar, the actual issues are not nearly as dire as made out to be.
I was looking into Microsoft's ERP options. It was so incredibly frustrating to understand and compare their different offerings that I just gave up, but they did have an option to contact their sales department, which was probably their plan all along.
Don't forget that ASP.NET Core can target not only .NET Core but also .NET Framework. We had to target .NET Framework in our latest ASP.NET Core web app because we still need access to COM+/Interop.
This is exactly the issue being discussed here - ASP.NET Core will not run on .NET Framework in the future.
Today .NET Core is younger and smaller so it's easy to also run .NET Core apps on .NET Framework since .NET Framework is larger and has everything implemented already - but as we progress with new APIs in .NET Standard, .NET Framework will continue to fall behind (until it gets those major multi-year updates). So you can still use older ASP.NET Core versions (like today's 1.x) on .NET Framework but eventually the newer versions will be too far ahead of the .NET Framework.
However, .NET Core is cross-platform and runs just fine on windows, brings its own dependencies, and will be able to use .NET Framework libraries so this whole issue is overblown. System.Drawing, ActiveDirectory, and other APIs will be added to .NET Core soon enough. If you really need COM interop, then either just stick with .NET Framework apps or use older .NET Core apps.
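For reference, the P/Invoke escape hatch people keep mentioning is just a `[DllImport]` binding. A minimal sketch, using the Linux C library since that's the cross-platform case (on Windows you would import from a system DLL such as `kernel32.dll` instead):

```csharp
using System;
using System.Runtime.InteropServices;

// Minimal P/Invoke sketch: [DllImport] binds a C# method directly to a
// native export. "libc.so.6" is the glibc shared library on Linux.
// Note that COM automation is a separate, Windows-only interop
// mechanism that plain P/Invoke does not replace.
public static class NativeMethods
{
    [DllImport("libc.so.6", EntryPoint = "getpid")]
    public static extern int GetPid();
}
```

Calling `NativeMethods.GetPid()` returns the current process id on a glibc-based Linux system; the library name and entry point are the platform-specific part you would swap per OS.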
".NET Framework" made sense when there was only one real framework available, but there is now Core, Compact, Mono, Unity, etc. which makes it confusing.
".NET Desktop" can be used instead, although it's also misleading since it has nothing to do with desktops. ".NET Full Framework" is also used often.
Perhaps the best name would be ".NET Legacy Framework" with the understanding that it is a heavy, slow-moving framework implementation that only runs on Windows (and is limited by the Windows version as well).
I sometimes feel like I need a PhD to understand Microsoft's overarching .NET philosophy and how it all interoperates. It's strange, because the systems and what they are supposed to be able to do in terms of computing, human computer interfaces, and networking is largely unchanged since decades back. What is going on? Isn't the only major change recently that mobile devices have become more popular targets and web development is more useful due to web browsers being cross-platform renderers?
It's incredibly frustrating. I've been on the .NET wagon since ASP.NET 1.0 (and ASP before that), and my day to day friction with .NET has been increasing steadily over the last few years as more and more of the ecosystem starts to support "core".
Hosting in IIS with .NET Core is no different than before; it all works the same. The only difference is that IIS won't be running the code but just acting as a simple proxy, so you turn off the "Managed Code" setting in the application pool for that site. Then just web-deploy like before and everything works.
All those other settings are settings that were there before.
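Concretely, the "IIS as a simple proxy" setup is driven by the ASP.NET Core Module entry in web.config. A minimal sketch (`MyApp.dll` is a placeholder for your published app):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <system.webServer>
    <handlers>
      <!-- Hand every request to the ASP.NET Core Module instead of
           running managed code inside IIS -->
      <add name="aspNetCore" path="*" verb="*" modules="AspNetCoreModule"
           resourceType="Unspecified" />
    </handlers>
    <!-- IIS launches the Kestrel process and proxies requests to it -->
    <aspNetCore processPath="dotnet" arguments=".\MyApp.dll"
                stdoutLogEnabled="false" />
  </system.webServer>
</configuration>
```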
My God, is this ever going to settle down?! I skipped all the .NET Core 1.x fiasco and decided to wait for .NET Standard 2.0, as it seemed like they learned the lesson, but now it starts all over again.
Ah, the old debate in .Net between the lumpers and the splitters.
Between the "just install .Net and that's it, it makes no sense to install all these little interdependent pieces one at a time. it's too complex and versioning is hell" and the "this web server doesn't need the WPF libraries, it's just dead weight, bloat, and makes it harder to start up new machines. It makes no sense to bring out a new version of WPF because there's a bug fix in ASP. Or worse, delay release."
The thing is, there is no one right answer, only opposing forces that have to be reconciled.
.Net core started out fairly radical, which favoured the early adopters and neophiles who tend towards splitters, but is now generally veering towards larger corporate adoption, which tends towards lumpers. But there are exceptions.
I don't think anyone really is angered by a modular system; it's more that needing to pay attention to every little bit (especially when those bits are mixed up in really poor naming, corporate speak, etc.) is the more practical issue.
> Between the "just install .Net and that's it [...]" and the "this web server doesn't need the WPF libraries, it's just dead weight [...]"
> The thing is, there is no one right answer, only opposing forces that have to be reconciled.
Really? I thought this was a mostly solved problem. I certainly see it as such when I use modern packaging systems that were developed almost two decades ago.
> I certainly see it as such when I use modern packaging systems that were developed almost two decades ago.
I'm not sure if you have something specific to say about the nuget package management system; how it was used to deliver packages in .Net core and how and why the strategy has changed from delivering the framework as lots of small packages to delivering it as few larger packages; or if you are just being flippant and ignorant.
I would certainly go with ignorant. I stay away from Windows and, in consequence, from .NET. I've heard about NuGet, but I haven't used it, so I don't know if it is any good.
On the other hand, I've seen both strategies (large and small packages) in use, and the better the dependency resolution mechanisms and package building tools, the better small packages work. I don't need to install a plethora of GNOME and Python packages; I just run `apt-get install exaile' and have it installed. From a programmer's perspective this works equally well; I may want to include some specific part of the Boost library, but I can also run `apt-get install libboost-all-dev' and be done, and leave detecting the actual dependencies to the package builder's tooling.
NuGet is okay, in my opinion. My two biggest issues with it aren't actually issues with NuGet, but with the surrounding ecosystem.
First is that MSBuild hard-codes references, so building a multi-targeted project with references to a multi-targeted NuGet package is a nightmare. From what I've read in the documentation, Paket (an alternative package manager) is supposed to be designed to help with this, but it was already enough of a struggle to introduce NuGet at my work, and NuGet is "built-in" to Visual Studio.
Second is the fragmentation of the .NET frameworks, a la this article. Combined with how .NET makes native interop fairly easy (i.e. platform dependent packages), you get either bloated monolith packages or a proliferation of satellite packages.
Other than that, I've had a fairly easy and straightforward time with NuGet. But then again, I've never seen a project with more than a dozen or so dependencies, compared to using Maven in Java where you pull in dependencies like hotcakes.
I'm split. I'd love to see the churn settle down so the surrounding ecosystem can mature. On the other hand, there are definite pain points with the existing tooling. I remember seeing declarative package references as a feature request for Roslyn, which would be absolutely wonderful, but even if it's released fairly soon, it's probably going to take years for adoption to spread.
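For what it's worth, the multi-targeting pain described above usually surfaces in code as conditional compilation against the per-target symbols the build defines (`NET461`, `NETSTANDARD2_0`, and so on). A small sketch:

```csharp
using System;

// When one project targets several frameworks, platform differences
// hide behind preprocessor symbols that the build defines per target.
// Every extra target multiplies the build (and packaging) matrix.
public static class AppBase
{
    public static string Directory()
    {
#if NET461
        // Full-framework-only API surface...
        return AppDomain.CurrentDomain.BaseDirectory;
#else
        // ...and the portable equivalent for .NET Standard / .NET Core.
        return AppContext.BaseDirectory;
#endif
    }
}
```

Each `#if` branch is compiled once per target framework, which is exactly why a multi-targeted package ships multiple assemblies.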
> First is that MSBuild hard-codes references, so building a multi-targeted project with references to a multi-targeted NuGet package is a nightmare. From what I've read in the documentation, Paket (an alternative package manager) is supposed to be designed to help with this, but it was already enough of a struggle to introduce NuGet at my work, and NuGet is "built-in" to Visual Studio.
> Second is the fragmentation of the .NET frameworks [...] [you get monoliths] or a proliferation of satellite packages.
And I don't see why "proliferation of satellite packages" is supposed to be bad. It works in Debian's APT, including development packages. Maybe somehow NuGet is not okay, after all? Certainly something is different between the two.
You claimed that "there is no one right answer" to distributing code (or so I understood), which is false, because it works well in the world outside Windows.
That's just looking at the tip of the iceberg, I think. It also improves everyone's ability to patch their applications, although with VM and container automation, it's not as big of a deal anymore.
That's actually one thing I don't understand. Having a full framework installed on the machine means all the modules will be updated with the .NET Framework. Shipping with your own binaries means you will need to do the upgrade yourself. That's fine if you have an active team maintaining the application. But in the real world there are many, many apps and websites developed once and rarely upgraded. These will not receive security patches if they shipped with their own binaries. I am not convinced this is a benefit.
Your real world is not necessarily the same as everyone else's real world, and in some cases there is great benefit in being able to say "I am going to update the app that I own, and I can guarantee that it won't affect the app that you own, because they both ship with their own binaries, not a shared, system-wide framework."
There are other cases, e.g. using EC2 VMs in AWS, where it's moot; machines come and go, and there is only ever one app per machine. So installing per machine or per app is one install either way, except that the per-app install has the potential to be slimmer, not one-size-fits-all.
In short, it's more friendly to new scenarios involving clouds and containers.
And I can see how that will be useful in the case of an active team deploying a service continuously. But I am ready to bet it will also be used (because it's the cool new technology) by lots of people who will create something, perhaps with the intention to maintain it (or not) then move on, and that's how you end up with unpatched software online and sensitive data leaked.
They're better at "download this and everything will work" than any equivalent I've found in the ecosystem for other languages (though elixir/phoenix is there, too). I'm impressed (as a .net newcomer).
With postgres support through npgsql, I'm doing my latest side project in C#. We'll see how it goes.
The standard .NET Framework has been settled and stable for a decade, and it's still completely valid for use.
.NET Core is a new project with a vast amount of reworking and as usual, version 2.0 tends to be the best release to wait for. Considering it's not even out yet, not sure what all the outrage is for.
This is what Scott Hanselman wrote, please read that first:
==
I can see why this is initially a scary WTF moment. Let me explain because it’s less freaky than it seems.
You said .NET customers are going to need to interoperate. Totally agree.
We can share netstandard libraries between ASP.NET Core 2.0 and EVERYWHERE.
We can even reference many net461+ assemblies from ASP.NET Core 2.0 because of typeforwarding and netstandard20
You said WebApps may need to use:
AD – Totally, this is a gap IF you want to call LDAP directly. You can certainly auth against Windows Auth NOW. We plan to have specifically the DirectoryServices namespace for Core 2.0 around summer timeframe
Drawing – Totally, this is a gap. We plan to have this for Core 2.0 around summer timeframe. Until this, these netstandard options also exist ImageSharp, ImageResizer, Mono options, etc
COM Automation – This has never been possible under Core 2.0, but you can certainly PInvoke if you want to hurt yourself. You could also local WebAPI to a net461+ process if you really want to hurt yourself.
Sharing code with WPF Apps – YES. Totally possible with netstandard2.0.
This is a weird change to make.
Feels like it but…
Think about it this way. WPF isn’t netstandard2.0, it knows it’s on net461+ and that’s OK. It’s optimized, but it can reference netstandard libs. ASP.NET Core 2.0 is optimized for Core 2.0 but it can reference shared libraries. Xamarin is the same.
.NET Core is side by side and it’s moving fast. It’s WAY faster than .NET (Full) Framework can move which is good. By building ASP.NET Core 2.0 on top of .NET Core 2.0 (which, remember, is a SUPERSET of .NET Standard) that means stuff can be built faster than NetFx or even Netstandard.
NetCore > Net Standard > NetFx when it comes to development speed and innovation.
Point is, if you are doing new work, netstandard20. If you have older net461+ libraries, MOST of those can be referenced under ASP.NET Core 2.0.
ASP.NET Core 1.1 which runs on .NET Framework will be fully supported for a year after we release 2.0. That workload is fully supported thru at least July of 2018.
The remaining large gaps missing in .NET Core 2 are System.DirectoryServices and System.Drawing. We are working to have a Windows compat pack which would enable both of those on .NET Core on Windows this summer.
What we need from you all is a clear list/understanding of WHY you think you need ASP.NET Core 2.0 to run on net461+. Be specific so we can close those gaps and let everyone be successful.
Boys! We've been caught. Quick, do Jazz hands and pretend it doesn't matter!
That does not seem like a sane response to me. "We move fast and break things, and that's good" is not something we should be hearing as a justification at this point.
And he's making a deal out of it being supported till 2018! A whole year! What do they think people are making on their framework? Apps that disappear after a year? Once you commit to a framework you'll be supporting it for 5 or 6 years.
Am I totally misreading it, as to me that is really not a reassuring response at all, quite the opposite. It seems to me that they've completely lost touch with their customers who want a stable, fast and predictable new version of MVC 4.
We built on top of ASP.NET Core because of the option to use it with netfx, and we built things which were intended to last 5-10 years, not for 1 year of support max. This is what the ASP.NET name means in business and it's why large, slow-moving organisations choose it over flavour of the month frameworks.
They should take a look at MS history and look back at the Windows Mobile (6) -> Windows Phone transition for what can happen worst case when you crap all over your enterprise customers.
Have you tried using it in VS2017? Nowhere to be seen. The runtime is supported yes, but you try combining and debugging a Silverlight solution with a .Net Core web app.
Of course - I was actually surprised it still worked in VS2015.
I don't expect it to be forwards compatible. (Hell, we're talking about tech that only runs in IE11 :) )
But one big difference is: Official support means updates in case of critical vulnerabilities and guarantees that, at least with IE11, it won't simply stop working after an update.
ASP.Net Core 1.x won't have any such guarantees after 2018.
And that's kinda hefty when the ASP.NET Core website (https://docs.microsoft.com/en-us/aspnet/core/) still refers to this page for guidelines as to which .NET Runtime to choose for ASP.Net Core(!) projects:
This is still splitting the ecosystem. This is the exact same mistake they made with the json project format.
Also, if you started using ASP.NET Core 1.0, and you actually do need stuff that's .NET 4+ only, you're stuck.
There are a ton of people/businesses that use old .net libraries, that now have no way forward, no way to gradually introduce asp .net core. I'm going to assume that, if this is a new policy, that all the other new core stuff (entity framework core, async enumerable) might also become core only, which would be an even bigger issue.
Also, I can't think of a good reason for this to happen. Are they trying to force people to .NET Core? Was supporting two .NET versions that big of a burden?
It certainly runs on .NET Core (I'm currently using it that way for a project); it just also runs on full-fat .NET, too. ASP.NET Core seems to break with that and only run on .NET Core.
That isn't the issue, the issue is that when I play with things I restrict myself to playing with things that I can potentially use in production at some point, I don't have a lot of 'play time' so it makes sense to prioritise things that have a potential return.
.NET Core is totally fine for playing around or for small apps and services.
It's usually for large projects that these backwards compatibility issues tend to crop up and be show-stoppers. You're likely better off with Mono in that case.
We've built our entire backend on .NET Core. We've got a lot of moving parts. The main pain points have ironically been certain Azure libraries which haven't been ported to the standard yet. I've also been using F# core and while the experience isn't yet there, it feels close.
He's also saying that "semantic versioning" is a recipe for breaking people's software and that if you're going to break things, then you should change the name / namespace. I agree with him.
The problem is not semantic versioning but the .NET platform having multiple branches of the same thing[0]. If MS was not braindead they would promote Mono to become .NET Core instead of building a new thing.
Hickey's Clojure introduces breaking changes even in minor versions[1]; how is this more sensible?
Even though its history isn't perfect, Clojure's changes are not breaking and it's one of those projects that sets a high bar for what backwards compatibility means.
Clojure code written for 1.0 (May 2009) works just fine in Clojure 1.8 and you'd find it challenging to find an item in that linked document that actually breaks compatibility.
It's an unfortunate side effect of the limited way some ecosystems handle multi-version dependencies; I would not blame this on semantic versioning (which clearly states which changes are to be considered breaking). With OSGi I can use one module that uses an older version and one module that uses a newer version and use these together in an application (as long as the dependencies are not exposed, which they unfortunately will be in almost all real-world code).
I recently started a new project in .NET Core because I really like C# (mostly from working with Unity), but I'm pretty sure I won't use it again for some time. I naively believed I could now use the .NET ecosystem on a small microservice running on Linux, but I quickly learned that support for .NET Core has to be actively built by maintainers of libs, the current choice isn't great, documentation is lacking, and there is a lot of confusion like the issue linked to here or the whole project.json vs csproj debacle.
I use NancyFX on the old .NET/Mono for that purpose and it works well. I keep hoping to switch to Core and it keeps not happening, but the old way seems to have a lot of life left in it.
This is what we're doing. Using .NET Framework and not having to pay energy to following this the past three years has been truly liberating. It's an as powerful framework as ever, perfectly suitable for modern web development and whatnot. Yes, you miss out on cross platform support from .NET Core, but I feel like that hasn't even begun taking off yet and I'd argue that using e.g. Go or Python might make you a happier human being in that case anyway, or at least with a more peaceful, productive state of mind.
An increasing amount of Mono is .NET Core. The wonder of open source is that you can do the effort once and share it. Eventually there will be no Mono, only .NET Core.
We wrote a .NET Core microservice a few months back and there were problems upgrading from 1.0 to 1.1 (or some minor point like that). I scrapped it and rewrote it in node.js, and now looking at this, I'm glad I did.
I just don't see .NET Core having any stability. Meanwhile, node just landed promises in core[0] a few minutes ago.
I understand what .NET Standard and .NET Core are, but I think they need a better strategy for their nomenclature than using .NET in the name of every runtime, platform and set of libraries.
The first thing they need to change for .Net Core is the name. "Core" is such a meaningless term. Call it something like .Net Multiplatform or something.
The issue is about Microsoft having just decided to drop support for being able to run ASP.NET Core 2.0 on the full .NET Framework. The current plan is to drop support for .NET v4.x after the current ASP.NET Core 1.x release:
Well, I think a big minus in the ecosystem is that many projects (including something popular like NHibernate) actually pulled themselves too far into the framework. They didn't create their own (which of course wasn't/isn't bad in itself), but it tangled them up with a lot of implementation from Microsoft (which Microsoft now breaks).
I was always very curious why that happened. I mean, .NET has things like DataTable/DataReader, classes that were fully(!) implemented. In Java, for SQL access you just got some interfaces and a Socket; the rest was up to you (and somehow that worked; we have drivers for every major database in Java). Actually, besides some java.io.InputStream/java.io.Reader, no such high-level classes were (or are) used in these drivers. Since Java 8 we have some nio classes that make byte manipulation a lot easier, but most stuff is still implemented by the driver and not by any framework at all (even ResultSet/Connection/DataSource is just an interface).
Usually the culture on the .NET side of the fence tends to be more practical code oriented for quick solutions to daily use cases, than the enterprise solutions targeted to every possible use case one might eventually encounter.
I love working on both sides, but sometimes we get into these kind of issues.
On the Java side, I never liked that Sun engineers never grasped the expectations of what being a desktop developer actually means (AWT, Swing, Java 3D, JAI, JDI...), which is an area Oracle actually managed to handle slightly better, but let's see.
Well, I wouldn't say that's an enterprise pattern, more like a "big online shop with dozens of users" pattern.
When your user count is below a certain threshold, a pattern like CQRS/Event Sourcing would be extreme overhead.
(And I'm not sure .NET made them popular; the only thing I know is that Microsoft actually has very good documentation about them, and Martin Fowler made good blog posts about it as early as 2005 or so, but the first time I heard of it was probably in Object-Oriented Software Construction (a Java book).)
I also find this extremely confusing, even after using .NET since 1.0.
I wish they would make a chart, diagram...call it what you may, on MSDN or the Github repo, indicating all the frameworks, a description and what developers should target when using one e.g. Desktop, Web or API, Mobile, Cloud etc.
That way, when comes the time to develop an app, we reference the chart and go from there.
I actually quit using .NET entirely because Mono is a trainwreck while this stuff is being sorted. Half of the libraries I'm used to spewed esoteric errors about framework mismatching and such last time I tried. The new stuff from MS is difficult to use except on a few blessed Linux distros and Mono is slow on the uptake.
As a .net Developer this also doesn't make much sense to me. I do know that I can keep firing up Visual Studio and keep working on my existing applications without this stuff hurting me. Also switching between .net versions will be fairly painless since they're all really similar. The only hard thing is understanding if you can port legacy app x to new stack y.
Nothing that would stop me from using .net.
Microsoft is really good at keeping developers dangling. Can / should I still use WPF? Did it secretly continue under a different name?
This issue seems to mostly be (healthy) debate on that "can you port legacy app x to new stack y" front. It should (eventually) resolve to most people's satisfaction, but the sausage is being made in public on GitHub and that's new and frightening to some .NET veterans.
«Can / should I still use WPF? Did it secretly continue under a different name?»
You can still use WPF if you are happy with it. The Universal Windows Platform (UWP)'s .NET/XAML stack is the relatively open and acknowledged successor to WPF. It should be quite familiar to existing WPF developers and offers some nice new features and performance. Porting to UWP is still sometimes tough from WPF, but Project Centennial/the Desktop Bridge can make it a lot easier to do the transition a piece at a time.
I started using .NET for one, and only one reason: interop. Now that they're gutting interop, I want to jump ship, except there aren't any alternatives!
This seems like FUD to me. (just skimmed the thread on mobile)
First, this is about a preview version only.
Second: moving the ASP.NET Core 2.0 target to .NET Standard 2.0 is a sensible step to me. This version will be supported by the desktop .NET 4.7 (Win7+) when the standard is RTM. That will be the point where dependent technologies are RTM'd as well. We should revisit this discussion then.
> Moving the ASP.NET Core 2.0 target to .NET Standard 2.0 is a sensible step to me.
Read the issue carefully: they've done the exact opposite. They've started changing ASP.NET Core 2.0 packages from targeting .NET Standard to targeting only .NET Core (netcoreapp2.0), so they will no longer be able to be referenced from the full .NET Framework.
They've also just announced that ASP.NET Core 1.x will be the last version that will be able to run on the Desktop CLR:
To be fair to the original commenter: all these 2.0 names (.NET Standard 2.0, .NET Core 2.0 and ASP.NET Core 2.0) make it hard to follow what is being talked about. I also first understood that ASP.NET Core 2.0 now depends on .NET Standard 2.0. That would have caused no problems for desktop, because .NET Framework 4.6 implements .NET Standard 2.0, as far as I understand.
Doesn't that make sense? Why would you want to run a core 2 app on the desktop CLR instead of distributing a binary? Not having suitable replacements for a lot of existing libraries is the bad part.
The issues people have with the move largely revolve around applications that need the .NET Framework (such as WPF applications) and embed network servers built on ASP.NET Core 1.x. That runs happily today, but 2.0 won't, and 1.x support isn't going to be around for more than a year after 2.0 drops.
WPF will probably never work on .NET Core, so after that they're going to have to have some kind of elaborate multiprocess solution if they want to stay in supported versions.
.NET Core 2.0 should run most libraries from every version of the .NET Framework, 1.0 to 4.6.1 (and potentially higher), with very few exceptions (albeit in some cases literally exceptions). (That's all the work that has gone into defining the .NET Standard 2.0 API.)
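As a concrete sketch of what that compatibility looks like in practice: under .NET Standard 2.0's compatibility mode, a .NET Core 2.0 project can usually reference a NuGet package that only ships a net45 build, and restore succeeds with a warning instead of failing outright. The package name below is a placeholder, not a real library:

```xml
<!-- Hypothetical netcoreapp2.0 app referencing a library that only ships net45
     assets. With the .NET Standard 2.0 compatibility shim, NuGet falls back to
     the net45 build and warns rather than erroring; it works as long as the
     library stays inside the shared API surface. -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp2.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <!-- "Some.Legacy.Library" is a placeholder name -->
    <PackageReference Include="Some.Legacy.Library" Version="1.0.0" />
  </ItemGroup>
</Project>
```

The "literally exceptions" caveat above is real, though: restore succeeding doesn't guarantee the library avoids APIs that throw at runtime on .NET Core.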
Yes, I agree with the guy needing NHibernate, for example; it's a pretty big deal in the ASP.NET world. Anyway, here's the issue tracker entry for that one: https://nhibernate.jira.com/browse/NH-3807
I also agree with the opinion that ASP.NET Core shouldn't have run on the full framework to begin with, to set the right expectations. We seem to be in yet another transitional period from Microsoft, and if the point is to decouple it from the .NET Framework, it shouldn't have had a foot in it.