The existing RTL TCP driver is quite different from its sibling RTL_SDR.
It's much more complicated, uses gr::blocks::deinterleave and
gr::blocks::float_to_complex, and generally doesn't work correctly
(e.g. gqrx issue #99, "Spectrum is mirrored when filter or demodulator
changes (rtl_tcp)": https://github.com/csete/gqrx/issues/99).
I've converted the RTL TCP driver to the model used by RTL_SDR,
simplifying it in the process, and fixing the GQRX issue.
Connections can fail when a host resolves to an IPv6 address first
and /etc/gai.conf is not configured to prefer IPv4 hosts.
The current logic handling the output of getaddrinfo() is
flawed in that it only ever attempts to connect() to the
first address returned.
This is a problem for both round-robin and dual-stack hosts.
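A fix could walk the full getaddrinfo() result list rather than stopping at the first entry. A minimal sketch follows; the helper name connect_any() is hypothetical and not part of the driver:

```cpp
#include <cstring>
#include <sys/types.h>
#include <sys/socket.h>
#include <netdb.h>
#include <unistd.h>

// Try every address returned by getaddrinfo() in order, instead of
// only the first one.  Returns a connected socket fd, or -1 on failure.
// (Sketch only: connect_any() is a hypothetical helper, not the
// driver's actual code.)
int connect_any(const char *host, const char *port)
{
    struct addrinfo hints;
    struct addrinfo *res = NULL;
    memset(&hints, 0, sizeof(hints));
    hints.ai_family = AF_UNSPEC;     /* accept both IPv4 and IPv6 */
    hints.ai_socktype = SOCK_STREAM;

    if (getaddrinfo(host, port, &hints, &res) != 0)
        return -1;

    int fd = -1;
    for (struct addrinfo *ai = res; ai != NULL; ai = ai->ai_next) {
        fd = socket(ai->ai_family, ai->ai_socktype, ai->ai_protocol);
        if (fd < 0)
            continue;
        if (connect(fd, ai->ai_addr, ai->ai_addrlen) == 0)
            break;                   /* connected: stop trying */
        close(fd);
        fd = -1;                     /* failed: try the next address */
    }
    freeaddrinfo(res);
    return fd;
}
```

This handles both round-robin DNS and dual-stack hosts: every candidate address gets a connect() attempt before the function gives up.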
Furthermore, rtl_tcp_source_c::rtl_tcp_source_c() assumes a colon
in the device string is a port number. This prevents the use
of raw IPv6 addresses. The function will need to be taught how
to handle IPv6 addresses contained within square brackets, e.g.
"rtl_tcp=[2001:db8::1]:1234".
Therefore further work is required to improve the handling of
multiple addresses, and also device strings containing raw IPv6
addresses.
Signed-off-by: Jeremy Visser <jeremy@visser.name>
Several tests have shown that this is the
highest sample rate at which no samples
are dropped on RTL devices.
Signed-off-by: Steve Markgraf <steve@steve-m.de>
Received from Juha Vierinen:
A student here noticed that there is dc bias even with the rafael tuner.
We looked into this issue and found that using 127.4f instead of 127.5f
removes this bias. I assume this is associated with a bug in the digital
downconversion of the RTL chip. This change fixes the problem.
For use with the rtl_tcp utility acting as a spectrum server.
The "empty" rtl_tcp= device hint might be used to connect to rtl_tcp
running on local machine.
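For example, assuming the gr-osmosdr osmocom_fft tool and its -a/--args device-argument option (a usage sketch, not verified against every version):

```shell
# Connect to rtl_tcp on the local machine using defaults:
osmocom_fft -a "rtl_tcp="

# Connect to a remote rtl_tcp spectrum server with an explicit
# host and port:
osmocom_fft -a "rtl_tcp=192.168.1.10:1234"
```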