
Future improvements

Modified 2020-07-19 by Luzian Bieri

The topic of benchmarking the performance of Duckiebots is far from exhausted. This thesis sets a baseline for benchmarking hardware behaviour, but both the software used and the applied measurements still have room for improvement. Some ideas are presented in this chapter.

Measurements

  • Collect more data
  • Include data from the localization system
  • Incorporate battery measurements
  • Run the benchmark using the `daffy` software release

API

  • Incorporate other benchmark procedures, e.g. a software benchmark
  • Add authentication via the Duckietown token
  • Upload the data to an external data-storage host
  • Cache the overall score to reduce the response time of the corresponding endpoint
  • Incorporate localization information
  • Use `bjoern` or a similar production WSGI (Web Server Gateway Interface) server, as the Flask development server is not safe to run in a production environment
  • Save generated graphs, further reducing response time
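The score-caching idea above can be sketched with a small time-based cache that recomputes the aggregate only when it is stale or has been invalidated by a new upload. This is a minimal illustration, not the API's actual code: the names `ScoreCache` and `compute_overall_score` are assumptions standing in for the real aggregation logic.

```python
import time


class ScoreCache:
    """Tiny TTL cache for an expensive aggregate score (illustrative sketch)."""

    def __init__(self, compute_fn, ttl_seconds=60.0):
        self._compute = compute_fn   # e.g. a hypothetical compute_overall_score()
        self._ttl = ttl_seconds
        self._value = None
        self._stamp = 0.0            # time of last computation

    def get(self):
        now = time.monotonic()
        if self._value is None or now - self._stamp > self._ttl:
            self._value = self._compute()   # recompute only when stale
            self._stamp = now
        return self._value

    def invalidate(self):
        """Call after a new benchmark result is uploaded."""
        self._value = None
```

The endpoint handler would then return `cache.get()` instead of recomputing the score on every request, and the upload handler would call `cache.invalidate()`.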

CLI

  • Adjust the Docker digests for the `daffy` and `daffy_new_deal` containers once they are released
  • Automate the hardware compliance check
  • Invoke callbacks in separate threads
  • Display a link to the new diagnostic result
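Running callbacks in separate threads, as suggested above, keeps a slow or blocking callback from stalling the benchmark loop. The sketch below uses Python's standard `threading` module; the helper name and signature are hypothetical, not the CLI's actual API.

```python
import threading


def dispatch_callbacks(callbacks, payload, timeout=5.0):
    """Run each callback in its own daemon thread (illustrative sketch).

    `timeout` bounds how long the caller waits for each callback when
    joining, so a hung callback cannot block the CLI indefinitely.
    """
    threads = []
    for cb in callbacks:
        t = threading.Thread(target=cb, args=(payload,), daemon=True)
        t.start()
        threads.append(t)
    for t in threads:
        t.join(timeout)  # don't wait forever on a hung callback
    return threads
```

If callbacks share mutable state, they must synchronize access themselves (e.g. with a `threading.Lock`).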

Frontend

  • Improve the frontend to enable direct comparison between the average and a newly supplied benchmark
  • Integrate the frontend into the dashboard
  • Make the frontend responsive