Open source 5G core network (github.com/free5gc)
168 points by polymorph1sm on June 5, 2020 | 48 comments



For those interested, this project seems to be a rewrite of https://nextepc.org/ in Go. Alternative projects include https://github.com/facebookincubator/magma and https://github.com/openairinterface.

You can start your own mobile network with these projects (though of course doing so without a spectrum license would be highly illegal), but first you need a gNB or eNodeB, which is lingo for a base station. It is possible to buy an SDR and use software like https://github.com/srsLTE/srsLTE to make the SDR receive and broadcast LTE/5G connections. After this, you need blank SIM cards to program. For example, this post explains how to do so: https://cyberloginit.com/2018/05/03/build-a-lte-network-with.... You might also consider writing eSIMs: https://github.com/bagyenda/njiwa.
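Whichever tool you use for the SIM step, the important constraint is that the credentials written to the card have to match the subscriber record you provision in the core (HSS in LTE, UDM/UDR in 5G). A rough sketch of the data involved, in plain Go rather than any project's actual API, with placeholder values:

    // Sketch only: the SIM and the core's subscriber database must agree on
    // IMSI, Ki and OPc. All values below are placeholders for illustration.
    package main

    import "fmt"

    // SIMProfile holds the minimum data you both write to the card and
    // provision in the core's subscriber database.
    type SIMProfile struct {
        MCC  string // mobile country code, 3 digits ("001" is the test MCC)
        MNC  string // mobile network code, 2 or 3 digits ("01" is the test MNC)
        MSIN string // subscriber number, pads the IMSI out to at most 15 digits
        Ki   string // 128-bit subscriber key, 32 hex characters
        OPc  string // 128-bit operator-derived key, 32 hex characters
    }

    // IMSI is simply the concatenation MCC + MNC + MSIN.
    func (p SIMProfile) IMSI() string { return p.MCC + p.MNC + p.MSIN }

    func main() {
        p := SIMProfile{
            MCC:  "001",
            MNC:  "01",
            MSIN: "0000000001",
            Ki:   "00112233445566778899AABBCCDDEEFF", // placeholder, generate your own
            OPc:  "FFEEDDCCBBAA99887766554433221100", // placeholder, generate your own
        }
        if len(p.IMSI()) > 15 {
            panic("IMSI must be at most 15 digits")
        }
        fmt.Println("program the card and provision the subscriber with IMSI", p.IMSI())
    }

If the card and the core disagree on any of these values, the phone will see the network but authentication will fail.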

After this, you should be able to use a commercial phone to connect to the network.

In general, these networks function quite well even on commodity hardware. Some tests I worked on produced end-to-end latency of around 20ms on LTE, and throughput of around 7MB/s. This was using NextEPC on a 400 euro computer with a 50 euro router and a Nokia base station. Further, it was possible to make the system work in a Heroku-like manner by exposing PaaS endpoints and running Kubernetes in the access network: https://www.semanticscholar.org/paper/Open-source-RANs-in-Pr...
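For what it's worth, a latency figure like that needs nothing fancier than a crude probe along these lines (not the original test harness, just the idea; it assumes a trivial UDP echo service reachable through the core, and the address is a placeholder):

    // Crude RTT probe run from a UE-attached machine towards an echo service
    // sitting behind the core.
    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // Placeholder address: a simple UDP echo server on the far side of the core.
        conn, err := net.Dial("udp", "203.0.113.10:7777")
        if err != nil {
            panic(err)
        }
        defer conn.Close()

        buf := make([]byte, 64)
        for i := 0; i < 10; i++ {
            start := time.Now()
            if _, err := conn.Write([]byte("ping")); err != nil {
                panic(err)
            }
            conn.SetReadDeadline(time.Now().Add(2 * time.Second))
            if _, err := conn.Read(buf); err != nil {
                panic(err)
            }
            fmt.Printf("rtt %2d: %v\n", i, time.Since(start))
        }
    }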

I think it's quite likely that sooner or later we will have 5G networks with intra-services running as edge services, providing us with low latency. It's worth noting that with the latest LTE versions, 5G and LTE access latency is virtually the same. The real latency optimization would be, at least from my perspective, to allow software to be deployed _into_ these cellular networks to minimize network hops.


I'm not an expert, but there are variants of LTE and 5G (LTE-U and 5G NR-U) that support operating in the unlicensed 5 GHz band.

That said, I'm not sure a commercial handset would connect to anything in that range. You could probably get an SDR-based UE (UE is telecom-speak for "thing that connects to a cell network") to work.


SDR-based UEs work for 4G right now (see srsLTE with the srsUE), but I'm not aware of an open source one that supports 5G standards yet.

There are unlicensed variants of LTE/5G, but it's worth noting that these are often designed to use the unlicensed spectrum as a supplemental downlink (e.g. License Assisted Access (LAA) in LTE), rather than actually allowing for uplink and downlink in the unlicensed spectrum.

The reason for that is simple politics - mobile operators are powerful lobbyists, and part of a powerful standards group (GSMA), and don't really fancy the idea that you should be able to run what a customer will perceive as a mobile network without significant investment in spectrum (a finite resource).

LTE-U and LAA won't let you run a network by yourself just on unlicensed spectrum. Multefire should, but is quite rare for the obvious political reasons outlined above.


>but is quite rare for the obvious political reasons outlined above.

Sorry, I can't find the part that mentions political reasons in the OP. I thought Multefire never took off simply because of licensing issues.


Oops, sorry. I was referring to the comment above about mobile operator politics and lobbying power. Mobile operators like to think of themselves as the only ones owning and operating services on IMT spectrum.

The idea of being able to deliver a full mobile service to mobile phones is "sellable" - operators like to think of themselves as being the only ones able to provide the service people expect to a mobile handset. Then they can bundle handsets with service provision.

If you are interested in this topic, it's worth looking at the fraught relationship between mobile operators and WiFi. Operators historically rejected WiFi from handsets early on - WiFi was a rival to their high-price, high-margin mobile data services. That lasted until perhaps the mid-3G days, when the idea of using WiFi for offload, given the limited spectrum for mobile data, started to become tempting for operators.

Even that's controversial - the "enterprise" market around WiFi wasn't hugely keen on that either, since they felt the mobile operators were just trying to snap up and freeload on the license-exempt spectrum for extra capacity, and use up (finite) WiFi spectrum capacity while providing an operator-badged service.

In my view, Multefire hasn't taken off due to the general high complexity of the tech stack people would need to understand to use it (fine for me with a telecoms engineering background, less fine if you are from the pure IT world and just want something quick - it's a lot more effort and complexity than setting up a couple of WiFi APs), and the lack of pressure from mobile operators to support it. Handset support for features comes from operator demand/desire. Multefire isn't something handset makers will add, unless operators demand it. Absent that, it risks alienating or upsetting them, by opening up the handsets to competition, and since operators are the main route to market for your handsets, market dictates the rules...


LTE-U is commercially available and currently in use on a limited scale for wireless internet service. The main (and almost only) manufacturer of base stations and clients is a company called Baicells.

While LTE-U does have some advantages over WiFi for WISP use (e.g. better handling of a large number of clients per AP), the higher cost (around $9k for an LTE-U eNodeB compared to as low as $500 for a WISP-grade 5 GHz WiFi AP) and scarcity of vendors mean that it's not very commonly used. We'll see if that changes over time, but WiFi has a huge amount of inertia in that space, so it seems difficult for LTE-U to catch up considering that the power levels are limited to such an extent that it doesn't have a huge range advantage over WiFi with good antennas.

In fact, just looking at Baicells it seems that they offer fewer models for LTE-U than they used to, so they may be finding that sales aren't enough to keep up the product line.


https://translate.google.com/translate?sl=auto&tl=en&u=https...

Apparently, there are overlaps between common LTE bands and both ham radio bands and unlicensed (low-power) ones. The page above states that any device supporting the right bands will work (and presumably they even got them working?).


I remember in the mid 2000s talking with a ham operator. The bands didn't overlap, but they were interleaved - the radios could access bands reserved for cell calls. If you snipped the right resistor off the right radio, then you could listen in on cellphone conversations.


Newer iPhones support band 42 TDD, which is 3400 to 3600 MHz. The amateur radio 9 cm band in the US is 3300 to 3500 MHz, so you could have a setup contained in a single ham band. However, the open source eNodeB products typically don't support TDD.


This is the CBRS band, and the idea behind CBRS is that people should be able to gain access to it on an ad-hoc basis by making a request for it, when it isn't being used for marine radar on the US coast.

I believe OpenAirInterface can handle TDD, although it was firmly "research grade" code last time I looked at it.

The positive from CBRS is that it should (or at least is intended to) spawn a new generation of lower cost small cell base stations, using this band, and speaking the CBRS "protocol" for spectrum access coordination. And that has potential to help reduce prices of radio equipment.

Handset compatibility is coming on this band quicker because some existing mobile operators have purchased PALs (priority access licenses) for CBRS spectrum, and intend to use this for some extra capacity.

P.S. just as a very minor technical correction, CBRS is defined for band 48, rather than 42, although with some overlap. B48 is 3550 to 3700 MHz, while B42 is 3.4 to 3.6 GHz as you said. Therefore when looking for devices, it's best to look for B48 (although at a push, if you're doing your own R&D, B42 will be fine for use in the lower 50 MHz section of the band).
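The overlaps mentioned in this sub-thread are easy to sanity-check with a few lines of interval arithmetic (band edges in MHz, taken from the comments above):

    // Quick sanity check of the band overlaps discussed above (all in MHz).
    package main

    import "fmt"

    type band struct {
        name      string
        low, high int // MHz
    }

    func maxInt(a, b int) int {
        if a > b {
            return a
        }
        return b
    }

    func minInt(a, b int) int {
        if a < b {
            return a
        }
        return b
    }

    // overlap returns the shared frequency range of two bands, if any.
    func overlap(a, b band) (int, int, bool) {
        lo, hi := maxInt(a.low, b.low), minInt(a.high, b.high)
        return lo, hi, lo < hi
    }

    func main() {
        b42 := band{"B42", 3400, 3600}
        b48 := band{"B48 (CBRS)", 3550, 3700}
        ham := band{"US 9 cm ham band", 3300, 3500}

        for _, pair := range [][2]band{{b42, ham}, {b42, b48}} {
            if lo, hi, ok := overlap(pair[0], pair[1]); ok {
                fmt.Printf("%s and %s overlap from %d to %d MHz (%d MHz)\n",
                    pair[0].name, pair[1].name, lo, hi, hi-lo)
            }
        }
        // Prints:
        //   B42 and US 9 cm ham band overlap from 3400 to 3500 MHz (100 MHz)
        //   B42 and B48 (CBRS) overlap from 3550 to 3600 MHz (50 MHz) <- the "lower 50 MHz" above
    }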


Yeah, I'm not an expert either, but there are initiatives in this area as well. One such initiative is Nokia Kuha: https://www.kuha.io/

The problem is exactly what you described: there are currently no commercial UEs or base stations. So yeah, I suppose one is supposed to use an SDR for the 5 GHz band.


That site does not load here. I get a CSP error. sigh, nokia.

https://www.nokia.com/networks/solutions/community-hosted-ne...

is this the same?

There is zero information about Nokia Kuha! It would be interesting to know whether this can support more 24/7 broadband clients than WiMAX et al.


It seems to be the same thing. The use case displayed on Nokia's website matches one of the blog posts available on kuha.io.


thanks, it's loading here now.

It seems that this is aimed exclusively at operators that already have the infrastructure to issue SIM cards et al, and at communities that are barred from deploying their own infrastructure but are willing to pay for it. A weird use case, but clearly aimed at the US and AU markets, I'd guess.

You need a community that is willing to front the cost of laying fiber and then, instead of distributing it or starting a small ISP/co-op, hands that new termination over to a telecom giant, which plugs this device into its existing billing network to provision the new subscribers with access under its existing plans and pricing.


You mean sXGP?


Just to add to the above, some countries have very lightweight spectrum access and licensing regimes. Ofcom in the UK lets you apply for a low-power license (fine for around the home etc.) for a relatively nominal fee, and that will get you usable mobile spectrum - 3.3 MHz in the DECT guardband at the top end of the 1800 MHz band (FDD Band 3), or the 10 MHz slice at the top of the 2300 MHz band (2390 to 2400 MHz, which is TDD band 40).

From memory, a 1-year local spectrum license should cost about £50.


Any recommended resources for learning about mobile communications technologies that aren't overpriced engineering certificates or surface-level marketing material for/from telecoms?


Coming from a software background, I think NextEPC and its new variant, Open5GS, have the most down-to-earth documentation and code to start from: https://open5gs.org/open5gs/docs/. The authors have many good papers that are easy to follow. Osmocom also has good resources: https://osmocom.org/projects/cellular-infrastructure/wiki

Most of the learning curve comes from the lingo. Especially now that 5G introduces softwarization of many of the network functions, the terms used by the cellular and software industries clash and may be confusing at times. But you can definitely learn the high-level idea of how the cellular networks function in this way.

For physical-level things, I imagine that a degree in signal processing is required. And for the more advanced software interfaces, I'm afraid you can only read about them, as most devices on which the software is implemented are proprietary. The open-source software you can find is nowhere near as fully implemented as the actual software that runs commercial networks.


> the terms used by the cellular and software industries clash and may be confusing at times.

lol, I've found a lot of the terminology around digital communications confusing. Terms meaning two or three different things depending on context, etc.



Oh nice, this is really cool. It even has some 802.11 stuff, which I was having a hard time finding as well (apparently IEEE used to make the specs available after a certain time, but that appears to have been killed).


That’s basically what AWS has proposed with Wavelength: https://aws.amazon.com/wavelength/


Magma started out by extending a hard fork of OAI. The projects have since diverged quite a bit, but some of the core bits are still shared. NextEPC was renamed to Open5GS a while back, and a lot of the open source telecom work seems to be happening there: https://github.com/open5gs/open5gs . The project was recently reorganized by its founder, though, and currently requires a CLA to contribute upstream.


> I think it's quite likely that sooner or later we will have 5G networks with intra-services running as edge-services, providing us low-latency.

Noob question, sorry, but: Why not Wifi?


Mobile networks allow seamless switching of cells. If you're on a call while in the car, your connections won't drop, and ideally you won't lose a single packet.

The network and the smartphone monitor nearby cells. If you're just about to switch cells because the quality is getting too bad, everything is already being prepared in the next cell.


With proper configuration, Wi-Fi can do this too. Things like 802.11r, 802.11k, etc. allow a station to roam between access points near-instantly.


MacOS + UniFi actually have seamless reconnection.


Because 5G, or even 4G LTE, is still technically superior to even the latest WiFi, 802.11ax.


Only slightly. Most of the difference is telcos paying the $$$ to get their antennas up high with line of sight.


Well, not really. 5G/4G are far better in noisy, complex multi-user environments. You could get higher bandwidth from 4G/5G than you could in the best-case WiFi scenario (line of sight) with equal spectrum.

802.11ax was basically about moving some of the LTE tools (OFDMA) over to WiFi, and it completely failed, pushing some intended features back to 802.11be.


An LTE (and, I guess, 5G on 3.6 GHz) signal can cover many kilometers. The protocols are built to be more reliable (a phone keeps tabs on access points to which it can quickly hop if access to one base station is lost), and the licensing ensures that there cannot be spectrum suffocation at any time, which is quite common in bigger cities with WiFi.


Seamless handovers are definitely key to mobile networks, and much of the 3GPP technology is designed around this.

Spectrum licensing is more about ensuring exclusive use of spectrum - in a busy built-up area (less common during current times, hopefully), you'll still suffer from "spectrum suffocation" like with WiFi. If you've ever experienced data stalling where your phone indicated a signal but you weren't getting any actual traffic through, you were on a cell that was hitting the limits of its air interface capacity.

The nice part of cellular networks is that you can add more base stations using other licensed spectrum the operator has, in order to get more capacity. So with the 3.5 GHz 5G bands, and the wider channel bandwidths supported, these can take some of the load in busy places. Otherwise the alternative is to deploy more base stations with lower power on each, to reuse the spectrum more effectively.

Since the same operator runs all the equipment on their own spectrum, they can coordinate this much more effectively than ad-hoc multi operator WiFi in a congested area. The downside is the vastly increased complexity of a cellular solution (the sheer number of pages of standards), and the "entrenched player lock-in" available to legacy telecoms operators who already hold spectrum.


In addition to just the total amount of spectrum, coordinated scheduling between towers in the cellular standards helps with cooperative management of interference. Some of this is just now happening in 802.11ax, but without the high-speed non-microwave connections that cell towers in dense areas enjoy to coordinate over. 5G NR also allows coordinated multipoint transmission, which you can think of as a bit like MIMO but from multiple towers instead of just multiple antennas on the same tower. I don't know if this would ever be feasible in consumer WiFi equipment due to the challenges of synchronizing clocks across the different APs.


Can't find where COVID-19 infection code is. Is it not implemented yet?


Let's hope nobody finds out about Presidential Alerts. Basically, it lets you send Tweets to any mobile user.


Fairly straightforward to use these in other core networks. Not sure if this one implements the signalling for it.

Fun fact though - if a phone is without signal on its home network, it will attempt to join any valid 3GPP network it sees, and the cell broadcast/emergency alert will be triggered on the handset (even without a valid SIM) - no authentication takes place.


I think this vulnerability potentially has the biggest real-world impact. This has been shown in a couple of proof-of-concepts… Merging this into public repos would have zero positive impact, I think.


I'm inclined to agree with you there. Anyone who wanted to do it could figure it out anyway from the standards docs and broadcast the right data in the SIBs, but it is probably better not to have it available as a "click and run" type setup.

Although, having said that, this is unlikely to change any time soon, as the idea of CB/PWS is to provide an emergency message that can be highly time-sensitive in some scenarios (earthquake, tsunami, etc.) without delays due to authentication and the like. Failing to show the message could be a higher risk than showing a false message in a very localised area (based on what someone with an SDR can send).


Somewhat related recent curiosity: Is there anything new about 5G tower/backhaul infrastructure for edge computing that wasn't already feasible for 4G?


Nothing hugely different from "late-stage 4G" (Release 14). The main difference with 5G (which was available in Rel-14 but not widely used by anyone) is control and user plane separation, which means some network traffic can leave the user plane at the base station and go elsewhere, rather than being routed all the way back to the core network.

That enables edge computing by allowing traffic for the edge computing node to be routed to the local edge node directly, without it going down the backhaul.

This is very much a software change in the implementation of the base station, though.
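A toy way to picture what that separation buys you (purely conceptual, not any 3GPP interface or this project's code): the user-plane element sitting near the base station can decide, per destination, whether uplink traffic breaks out locally to a co-located edge node or continues down the backhaul towards the central core.

    // Conceptual sketch only: a local user-plane element deciding whether a
    // packet breaks out to a co-located edge node or goes down the backhaul.
    // Prefixes and addresses are made up for illustration.
    package main

    import (
        "fmt"
        "net/netip"
    )

    var edgePrefixes = []netip.Prefix{
        netip.MustParsePrefix("10.200.0.0/16"), // hypothetical subnet served by the local edge node
    }

    // route returns where uplink traffic for dst should be sent.
    func route(dst netip.Addr) string {
        for _, p := range edgePrefixes {
            if p.Contains(dst) {
                return "local edge node (no backhaul hop)"
            }
        }
        return "backhaul tunnel to the central core"
    }

    func main() {
        for _, s := range []string{"10.200.3.7", "198.51.100.7"} {
            dst := netip.MustParseAddr(s)
            fmt.Printf("%s -> %s\n", dst, route(dst))
        }
    }

In a real deployment that forwarding decision is driven by session and policy signalling from the control plane, which is exactly what separating the two planes makes possible.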


Exactly, control/user-plane separation is the primary differentiator for "5G". The whole point is to eliminate backhaul and allow closer CDNs and caches.


Is it? So Netflix is going to install a massive rack of storage in every cell tower, with expensive power and cooling and the risk of vandalism and theft? Even if you cached the entirety of Netflix, it's still only 10% of Internet traffic.

Backhaul is becoming less and less of a problem as more and more sites get fibre backhaul. Point-to-point links can easily do 10 Gbit/s now even if they don't have fibre (to another tower with fibre). That is almost certainly as much as, if not more than, the total capacity of most cell towers, even with a stupendous amount of spectrum allocated.


The user plane doesn't run on the cell tower. The mobile network backhaul is multiple stages, and the tower is just the first hop.

The problem with backhaul isn't the availability of fiber, it's the _concentration of bandwidth_.


So will this let me run a 5G baseband? I've read about Osmocom but I've never tried it out.


Not quite - this is the "core network", which handles things like handset authz/authn, billing, base station attachment, etc.

A 5G base station is called a gNB, and a handset is called a UE. This project doesn't involve the radio interface of 5G, so neither of those is included. There's another comment on the post outlining how you might go about standing up an actual network using this.
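For orientation, the kind of standardised network functions a standalone 5G core is expected to provide are sketched below (a rough glossary from the 3GPP architecture, not this project's exact component list):

    // Rough glossary of the main network functions in a standalone 5G core;
    // the radio side (gNB) and the handset (UE) sit outside the core.
    package main

    import "fmt"

    var coreFunctions = map[string]string{
        "AMF":  "Access and Mobility Management Function - registration, mobility, NAS signalling with the UE",
        "SMF":  "Session Management Function - sets up and manages PDU sessions",
        "UPF":  "User Plane Function - forwards the actual user traffic",
        "AUSF": "Authentication Server Function - runs subscriber authentication",
        "UDM":  "Unified Data Management - subscriber data and credentials",
        "NRF":  "NF Repository Function - service discovery between the other functions",
        "PCF":  "Policy Control Function - QoS and charging policy",
    }

    func main() {
        for nf, role := range coreFunctions {
            fmt.Printf("%-5s %s\n", nf, role)
        }
    }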


Is this really what constitutes a "5G core network"? Does it mean that I can buy Huawei for the rest of it and set it up in the UK?


These are a few components that make up the very baseline of some aspects of a standalone 5G core.

If you want to add commercial base stations to it, you'll find a few non-standardised things missing from all "open source" solutions - you'd need base station orchestration and control, to handle configuring and managing the base stations and getting them to behave in the ways you want.

For one or two base stations you might cope without it, if you can figure out how to configure the radios, and they happen to work. But generally, there's a load of integration to be done that isn't part of the standards, and that you'll need in addition to this. You'll also need to handle scaling up and clustering some of the components that need to scale up with traffic.


This won't have a chance to win unless, e.g., Cisco/Google/Apple/FB somehow decide to each fund a few thousand developers yearly. And they won't do that unless there's a clear incentive for them.

In a way I really hope they won't. How could anything good come out of that?

From an engineering perspective: the software stack is conceptually (if you squint enough) a bit like WebKit/Chromium, but on top of that you also need cutting-edge radio/antenna development.



