For my job it would be handy if the software could autodetect the serial baud rate.
Since you have developed the Serial Tap, perhaps you could use the minimum time between two edges on RX or TX to infer the baud rate in use.
This would be for situations where I connect to a device over serial and I know it sends data without being given a specific command (e.g. TTL serial on a Raspberry Pi, Marlin boards). For now I always have to change baud rates manually and check until readable text is displayed. For devices that don't use the most common baud rates (9600, 115200, ...), this would save me quite some time.
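The idea above can be sketched in a few lines. This is a hypothetical illustration, not the Serial Tap's implementation: given timestamps of line edges, the shortest gap between consecutive edges approximates one bit time, and its reciprocal is snapped to the nearest standard rate. The function name and the rate table are my own assumptions.

```python
# Hypothetical sketch: infer the baud rate from the minimum gap between
# consecutive RX/TX edges, then snap to the nearest standard rate.

STANDARD_BAUD_RATES = [1200, 2400, 4800, 9600, 19200, 38400, 57600, 115200, 230400]

def estimate_baud(edge_times):
    """Guess the baud rate from edge timestamps (in seconds).

    The shortest gap between two consecutive edges approximates one
    bit time, so its reciprocal approximates the baud rate.
    """
    if len(edge_times) < 2:
        raise ValueError("need at least two edges")
    min_gap = min(b - a for a, b in zip(edge_times, edge_times[1:]))
    raw_baud = 1.0 / min_gap
    # Snap to the closest standard rate.
    return min(STANDARD_BAUD_RATES, key=lambda r: abs(r - raw_baud))

# Edges spaced ~104.2 us apart correspond to 9600 baud.
print(estimate_baud([0.0, 0.0001042, 0.0002084]))  # -> 9600
```

Note that this only works once the stream has actually contained a single-bit-wide pulse; until then the estimate can be too low.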
Thanks for the feedback!
Auto-baud rate detection could be convenient, but when used in a sniffer there's a quirk -- detecting the baud rate can't be 100% reliable without prior knowledge of the incoming data stream. Serial protocols that support auto-baud detection always use specific packet headers that allow reliable detection of the baud rate for exactly this reason. So a sniffer with auto-baud detection (which can't make any assumptions about the underlying data) could initially produce incorrect bytes before it deduces the actual baud rate.
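The ambiguity is easy to demonstrate. The following hypothetical simulation (my own sketch, not anything from the actual product) generates the edge times of a single 8N1 frame and feeds them to a min-gap estimator: a byte with alternating bits yields the true rate, while a byte that never holds the line for just one bit time makes the estimator report roughly half the real baud rate.

```python
# Hypothetical demo of the failure mode: a min-gap baud estimator only
# works if the data happens to contain a single-bit-wide pulse.

def frame_levels(byte):
    # 8N1 framing: start bit (0), eight data bits LSB first, stop bit (1)
    return [0] + [(byte >> i) & 1 for i in range(8)] + [1]

def edge_times(byte, baud):
    """Timestamps (seconds) of the level transitions of one frame."""
    bit = 1.0 / baud
    times, prev = [], 1  # the line idles high
    for i, level in enumerate(frame_levels(byte)):
        if level != prev:
            times.append(i * bit)
            prev = level
    return times

def estimate_baud(times):
    gaps = [b - a for a, b in zip(times, times[1:])]
    return round(1.0 / min(gaps))

# 0x55 alternates 0/1 every bit, so the shortest gap is one bit time.
print(estimate_baud(edge_times(0x55, 115200)))  # -> 115200
# 0x0C has no 1-bit runs: the shortest gap is two bit times,
# so the estimate comes out at about half the true rate.
print(estimate_baud(edge_times(0x0C, 115200)))  # -> 57600
```

This is also why protocols with built-in auto-baud (LIN's 0x55 sync byte, for instance) deliberately send an alternating bit pattern first.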
All that said, yes, we are considering adding automatic baud rate detection to the new generation of our Serial Tap devices.