Don't get me started on that... 'majority', and I guess anyone who matters, has an Apple device.
Change the password on the 2.4 GHz network and make sure only the 2Go uses it.

On an aside, I just found out that my Galaxy Note 9 locks on to the 2.4 GHz band when I'm outside on the lawn but won't switch back to the 5 GHz band when I'm back in the house. This adds to the 2Go's streaming woes. The only way to remedy this is to toggle the Wi-Fi off/on in the phone settings...
Man, Chord is making us Work.
Sorry to burst your bubble pal, but the RBR50 doesn't work... testing in a single-user environment is NOT QA.

Ok, apologies for the long posts, but I'm going to try and answer everything in one go, as at the moment I am unable to be on Head-Fi as much as I would have previously been.
No one can deny that 2Go uses 2.4Ghz WiFi and it was widely advertised as such.
I won't cover the same ground again but there were technical reasons for doing this
over the option to use 5Ghz. Whilst I know it is a point of contention and widely discussed
on here we have no plans to change and 2Go will remain as it is.
The addition of the Ethernet port was based on direct feedback from Poly customers and
was added specifically for more demanding applications where you have the option
of higher speed and bandwidth than you will ever achieve using WiFi.
And before you ask yes I realise that using a cable means that 2Go/Hugo2 is no longer
portable but we know that the vast majority of customers use the products in a
fixed location where this is less of an issue and Ethernet just provides an option for
far better connectivity.
Looking at the issue with Roon, I'm afraid I am not going to talk about it too much on here.
We will work with Roon as we always have done and find out what is going on.
However, I can say that using Roon with a streaming service and upsampling at the same time will use a lot
of bandwidth, but from our testing it would appear to be working. I admit you need to
use a high-performance router, but this should be expected for this sort of more advanced use case.
I personally use a Netgear Orbi RBR50 and if I repeat the same tests you are running on here
it just works. Of course we understand that 2.4Ghz has limited throughput, but we are not trying to stream in real time,
so potentially it's just about the buffering involved, but that is a complex issue as it is tied into the requirements
of Roon, which we will discuss with them.
For the moment I would encourage you all with this issue to email support@chordelectronics.co.uk so we can collate all the
information together.
Tbh even without upsampling it sounds better than any other DAP. I'm just pissed they were not truthful and let us run around like fools, wasting precious hours doing our own testing and increasing our frustration quotient. If they had offered a disclaimer from day 1, I'd be fine knowing that upsampled and higher bit rates over Wi-Fi are a crap shoot.

For those who are unhappy with your purchase, it is very clear from recent posts that it won't be fixed by software updates. No need to air your grievances here, just return it through the usual channels as unfit for purpose...
Such a shame as it is so pretty
So in order to fix the mixed-up form factor issue with the M Scaler in the chain, are we to expect a "2scaler" that fits between the 2Go and the Hugo 2 at some point in the near future? The form factors are presently a mess.

Not gonna happen. However, at some point in the future I expect Chord to launch a portable DAC + upsampler. Probably a couple of years away. Big opportunity for someone to create a DAP that runs HQPlayer natively.
I admit you need to use a high-performance router, but this should be expected for this sort of more advanced use case.
I personally use a Netgear Orbi RBR50 and if I repeat the same tests you are running on here
it just works.
Could you please send all the above findings to support@chordelectronics.co.uk and ask for them to confirm or dismiss this, and if so provide official feedback! Very important we get to hear from them whether this is correct or not so we have the facts.

So, a little more detail from my friend on the Roon forum about the crippled 2Go Wi-Fi chip. TBH it is false advertising to call this a hi-res wireless streamer... if you start to run into trouble at 96 and it's simply incapable of streaming anything above 192, Chord really eff'd this up:
SHORT VERSION (WHICH EXACTLY EXPLAINS ALL THE ISSUES I HAVE HAD WITH 3 DIFFERENT ROUTER SETUPS):
"As you can see, reported by my Meraki (==enterprise oriented kit), the 2Go only supports a single stream. This means real-world it will get a max data rate of around 20-25Mb/s, assuming “normal” interference levels and NO OTHER DEVICES on the 2.4Ghz channel consuming anything more than trivial amounts of bandwidth (e.g., your kid firing up a 2.4Ghz-only phone or laptop and streaming Disney+ at the same time will absolutely kill you). 192k PCM (~12.5Mb/s) could work, but is likely to have hiccups. 384k PCM (~25Mb/s) MIGHT work when the stars are aligned, but is more or less doomed. And as said before, 768k (~50Mb/s) will never work."
LONG VERSION:
So I saw this got reposted over on head-fi (where I do not have an account), so I figured I’d give a bit more background here. Sorry for the wall of text…
Wi-Fi transmission rates have three main factors that determine their maximum theoretical bandwidth: channel width, symbol encoding scheme, and stream count.
On 2.4Ghz, channel width is fixed at 20Mhz. (Use of 40Mhz on 2.4Ghz is considered very bad neighborship – the only practical real-world usage is for point-to-point wireless links using highly directional antennas such as a Yagi.) On 5Ghz, wider channels are supported. 802.11n supports 40Mhz channels. 802.11ac and 802.11ax support up to 160Mhz channels, but that’s not really practical in the real world, except for the afore-mentioned point-to-point links. The realistic maximum is 80Mhz channels, which can work OK in the home where you typically have low-density deployments. (But keep in mind density is affected not just by your kit, but by all your surrounding neighbors’ kit as well.) For enterprise, channel width is usually restricted to 40Mhz, and even restricting everything to 20Mhz is quite common for mid to very dense deployments (e.g., stadiums).
The symbol encoding scheme is a variable sliding scale, referred to as the “MCS index”. The client and the AP each exchange frames indicating their max supported index. (The index actually encodes both the symbol encoding scheme and the stream count, but don’t worry about that.) They then attempt to run the connection at the maximum rate both support. For 802.11n, that would be 64-QAM encoding with a 400ns guard interval, which gives a max possible rate of 72Mb/s per 20Mhz channel stream. The 2Go advertisement of 65Mb/s indicates it does not support the top MCS index, instead one down. Now, I keep mentioning “theoretical maximum”, because in the real world you will almost never communicate at the maximum signal rate. The client and AP are constantly re-evaluating the symbol encoding selection to maximize bandwidth as RF conditions change. And change they do – constantly. There is so much interference from other devices, wall penetration reducing signal strength, multipath reflections, etc., that your average real-world rates are often only half the theoretical max, and can frequently be MUCH WORSE. It’s not uncommon to see single-stream 2.4Ghz top out at around 20-25Mbit/s. And that’s BEFORE you factor in that Wi-Fi is a shared, half-duplex medium and other devices on your network will steal time slots, reducing those rates further. Hence why 384k PCM, which is ~25Mb/s continuous, is going to be extremely unreliable on a single-stream 2.4Ghz channel. And the ~50Mb/s of 768k PCM (or DSD512) is a complete and utter fantasy. (While not directly relevant here, since the 2Go is an 802.11n device, 802.11ac and 802.11ax introduce more modern encoding schemes capable of even higher symbol rates within the same channel width.)
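Those two per-stream figures fall straight out of the 802.11n OFDM parameters. A minimal sketch (constants per the 802.11n HT spec for a 20MHz channel with short guard interval) showing why the top MCS gives ~72Mb/s while "one down" gives exactly the 65Mb/s the 2Go advertises:

```python
# 802.11n (HT) single-stream PHY rates, 20 MHz channel, short (400 ns) GI.
DATA_SUBCARRIERS = 52        # data subcarriers in a 20 MHz HT channel
SYMBOL_TIME_US = 3.2 + 0.4   # 3.2 us OFDM symbol + 0.4 us short guard interval

def ht_rate_mbps(bits_per_subcarrier, coding_rate):
    """Theoretical per-stream rate in Mb/s for one modulation/coding choice."""
    bits_per_symbol = DATA_SUBCARRIERS * bits_per_subcarrier * coding_rate
    return bits_per_symbol / SYMBOL_TIME_US  # bits per us == Mb/s

mcs7 = ht_rate_mbps(6, 5/6)  # 64-QAM, rate 5/6 -> ~72.2 Mb/s (top index)
mcs6 = ht_rate_mbps(6, 3/4)  # 64-QAM, rate 3/4 -> 65.0 Mb/s (one down)
print(round(mcs7, 1), round(mcs6, 1))
```

So a device advertising 65Mb/s is capped at 64-QAM with rate-3/4 coding on a single 20MHz stream.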
The third factor is concurrent spatial streams. Wi-Fi is based on spread spectrum technology, whereby the signal is sent on different frequencies within the channel using a mapping pattern such that the energy is “spread” across the spectrum. Using multiple radios (each with their own antenna), with their transmissions offset in the spreading pattern from each other, you can effectively transmit multiple “streams” at the same time within a single channel. There are practical limits to this, however, both in physical terms (e.g., you need separate radios, antennas and sufficient power for each stream) and electromagnetic terms (e.g., they will interfere with each other if there is not enough spacing between them, making the signal unrecoverable at the receiver). 802.11n supports a maximum of four streams. In practice, only access points support the full four streams. Most client solutions were 1x1 (single receive, single transmit – typically used for IoT devices and other low bandwidth/cheap/etc devices), 2x1 (2 receive, 1 transmit – typically used by cheap to mid tier laptops where download is more important than upload), 2x2 (high end laptops) or 3x(2|3) (extremely rare, very high end laptops – plus some PCI Wi-Fi cards for fixed-in-place desktops).
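Putting the stream count together with the real-world derating described above: the post's "20-25Mb/s real-world from a 65Mb/s single stream" amounts to roughly halving the PHY rate for interference plus MAC overhead, and each extra spatial stream multiplies what's left. A rough sketch (the two derating factors are illustrative assumptions, not measured values):

```python
def real_world_estimate_mbps(phy_rate_mbps, streams=1,
                             mac_efficiency=0.6, rf_derate=0.55):
    # mac_efficiency: protocol overhead (ACKs, contention, preambles) -- assumed.
    # rf_derate: interference/distance forcing lower MCS on average -- assumed.
    return phy_rate_mbps * streams * mac_efficiency * rf_derate

single = real_world_estimate_mbps(65, streams=1)  # lands in the ~20-25 Mb/s range
dual   = real_world_estimate_mbps(65, streams=2)  # a 2x2 client roughly doubles it
print(round(single, 1), round(dual, 1))
```

Under these assumptions a dual-stream 2.4GHz client would clear the ~25Mb/s needed for 384k PCM, while a single stream sits right at the edge.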
[Image: Meraki dashboard screenshot showing the 2Go's client details, 740×1062]
As you can see, reported by my Meraki (==enterprise oriented kit), the 2Go only supports a single stream. This means real-world it will get max data rate of around 20-25Mb/s, assuming “normal” interference levels and NO OTHER DEVICES on the 2.4Ghz channel consuming anything more than trivial amounts of bandwidth (e.g., your kid firing up a 2.4Ghz-only phone or laptop and streaming Disney+ at the same time will absolutely kill you). 192k PCM (~12.5Mb/s) could work, but is likely to have hiccups. 384k PCM (~25Mb/s) MIGHT work when the stars are aligned, but is more or less doomed. And as said before, 768k (~50Mb/s) will never work.
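The quoted PCM figures are consistent with stereo samples carried at 32 bits per sample (an assumption that fits the numbers: 192kHz × 32bit × 2ch ≈ 12.3Mb/s, matching the ~12.5Mb/s cited). A quick sketch of that arithmetic:

```python
def pcm_mbps(sample_rate_hz, bits_per_sample=32, channels=2):
    # 32-bit stereo assumed, since it reproduces the ~12.5/25/50 Mb/s
    # figures quoted in the post for 192k/384k/768k PCM.
    return sample_rate_hz * bits_per_sample * channels / 1e6

for rate in (192_000, 384_000, 768_000):
    print(rate, round(pcm_mbps(rate), 1), "Mb/s")
```

Compare those continuous bitrates against the ~20-25Mb/s a single 2.4GHz stream actually delivers and the hiccup pattern above follows directly.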
Had Chord used a dual-stream solution, they'd have bought the 2Go a bit of headroom such that 384k would probably work most of the time, and 768k could work when the stars are aligned, but would largely be unreliable. Better yet would have been a dual-band, dual-stream solution, as then you end up with four times the real-world bandwidth of the solution they implemented. Even 768k could work then, though it would probably still suffer hiccups. A big advantage of 5Ghz is way more channels, too, so much less interference from your neighbors etc., making for a far more reliable signal (= higher real-world symbol rates on average).