Pi4 potential? #7

Open
paulhothersall opened this issue Feb 21, 2021 · 38 comments

@paulhothersall

The USB-C connection has full OTG support, and the hardware is "reasonable" with 2, 4, and 8 GB RAM options.

I have the hardware and am happy to help support/test.

@tomasz-grobelny (Owner)

tomasz-grobelny commented Feb 21, 2021

The RPi4 should indeed work - great! Please follow the steps in the doc/INSTALL.md file. I do not expect any major differences, except maybe getting kernel 5.4.x+ ready. Please report any questions/issues you encounter here and I will do my best to help. When actually running AACS, openauto will be useful for initial testing, so please have it ready on your PC.
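A quick sanity check for the kernel prerequisite (a sketch; module availability varies by distro image):

# the running kernel should be 5.4 or newer
uname -r

# the dwc2 UDC driver and the gadget framework should be available as modules
modinfo dwc2 libcomposite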

@paulhothersall (Author)

I have OpenAuto Pro (licensed) on a Pi3, and also Crankshaft.

Both of those work as head units from a stock Pixel setup.

I am starting by setting up an Ubuntu 20.10 arm64 server first, and will fall back to Raspbian (aka Raspberry Pi OS) if I have issues with the USB OTG part.
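For reference, the usual way to put the Pi 4's USB-C port into gadget (peripheral) mode is the dwc2 overlay - a sketch, assuming Raspberry Pi OS paths (on Ubuntu the firmware config lives under /boot/firmware/ instead):

# enable the dwc2 USB device controller on the USB-C port
echo "dtoverlay=dwc2,dr_mode=peripheral" | sudo tee -a /boot/config.txt

# load the gadget modules at boot
printf "dwc2\nlibcomposite\n" | sudo tee -a /etc/modules

sudo reboot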

@tomasz-grobelny (Owner)

A first basic check would be to verify the existence of /sys/kernel/config/usb_gadget and the contents of /sys/class/udc after the libcomposite kernel module is loaded.
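A minimal check sequence (the UDC name fe980000.usb is specific to the Pi 4 and will differ on other boards):

sudo modprobe libcomposite

# the configfs gadget directory should now exist
ls /sys/kernel/config/usb_gadget

# a UDC should be listed here (fe980000.usb on the Pi 4); an empty
# directory means the controller is not running in peripheral mode
ls /sys/class/udc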

@paulhothersall (Author)

paulhothersall commented Feb 21, 2021

The Pi4 2 GB RAM version REALLY doesn't like building anbox! I could make a coffee on the CPU. Even the uSD card gets hot from I/O thrashing, so I am dropping back to a single-threaded build.
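For example, with a make- or CMake-driven build (generic flags, nothing anbox-specific):

# throttle parallelism so the 2 GB board doesn't swap itself to death
make -j1

# or, for a CMake build tree:
cmake --build . -j 1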

@tomasz-grobelny (Owner)

To test AACS alone you could skip anbox for now - it is only used for running OsmAnd navigation.

@paulhothersall (Author)

Good point.

A lightweight "video stream only" --> head-unit path is a thought I have right now. It should be possible with the hardware encoder on the Pi4, even taking the framebuffer directly (with some optimisation of the GStreamer chain so it does not then software decode/scale/re-encode).

@paulhothersall (Author)

/home/ubuntu/AA/AACS/AAServer/include/InputChannelHandler.h:14:8: error: ‘set’ in namespace ‘std’ does not name a template type
   14 |   std::set<int> registered_clients;
      |        ^~~
/home/ubuntu/AA/AACS/AAServer/include/InputChannelHandler.h:6:1: note: ‘std::set’ is defined in header ‘<set>’; did you forget to ‘#include <set>’?
    5 | #include "ChannelHandler.h"
  +++ |+#include <set>
    6 | #include
[ 89%] Building CXX object AAServer/CMakeFiles/AAServer.dir/MediaChannelSetupResponse.pb.cc.o
[ 89%] Building CXX object AAServer/CMakeFiles/AAServer.dir/Channel.pb.cc.o
/home/ubuntu/AA/AACS/AAServer/src/InputChannelHandler.cpp: In member function ‘void InputChannelHandler::sendHandshakeRequest()’:
/home/ubuntu/AA/AACS/AAServer/src/InputChannelHandler.cpp:29:43: warning: ‘int google::protobuf::MessageLite::ByteSize() const’ is deprecated: Please use ByteSizeLong() instead [-Wdeprecated-declarations]
29 | int bufSize = handshakeRequest.ByteSize();
| ^
In file included from /usr/include/google/protobuf/generated_enum_util.h:36,
from /usr/include/google/protobuf/map.h:55,
from /usr/include/google/protobuf/generated_message_table_driven.h:34,
from /home/ubuntu/AA/AACS/build/AAServer/InputChannel.pb.h:26,
from /home/ubuntu/AA/AACS/AAServer/src/InputChannelHandler.cpp:4:
/usr/include/google/protobuf/message_lite.h:420:7: note: declared here
420 | int ByteSize() const { return internal::ToIntSize(ByteSizeLong()); }
| ^~~~~~~~
/home/ubuntu/AA/AACS/AAServer/src/InputChannelHandler.cpp: In member function ‘virtual bool InputChannelHandler::handleMessageFromHeadunit(const Message&)’:
/home/ubuntu/AA/AACS/AAServer/src/InputChannelHandler.cpp:56:24: error: ‘registered_clients’ was not declared in this scope
56 | for (auto &&rc : registered_clients)
| ^~~~~~~~~~~~~~~~~~
/home/ubuntu/AA/AACS/AAServer/src/InputChannelHandler.cpp: In member function ‘virtual bool InputChannelHandler::handleMessageFromClient(int, uint8_t, bool, const std::vector<uint8_t>&)’:
/home/ubuntu/AA/AACS/AAServer/src/InputChannelHandler.cpp:69:3: error: ‘registered_clients’ was not declared in this scope
69 | registered_clients.insert(clientId);
| ^~~~~~~~~~~~~~~~~~
/home/ubuntu/AA/AACS/AAServer/src/InputChannelHandler.cpp: In member function ‘virtual void InputChannelHandler::disconnected(int)’:
/home/ubuntu/AA/AACS/AAServer/src/InputChannelHandler.cpp:78:3: error: ‘registered_clients’ was not declared in this scope
78 | registered_clients.erase(clientId);
| ^~~~~~~~~~~~~~~~~~

@tomasz-grobelny (Owner)

The error message is quite informative :-) Try adding #include <set> in InputChannelHandler.h - for some reason it works for me without this one, but it should be there.

@paulhothersall (Author)

#include <set>

worked, and RealWorld™ is in the way of any more setup for the next few hours. Will let you know how I get on later.

@paulhothersall (Author)

paulhothersall commented Feb 22, 2021

OK, Ubuntu and the OTG stuff seem "interesting" - AAServer can't find a USB gadget to use, etc.

Using Raspberry Pi OS seems easier for that; however, I can't build with cmake:

-- Checking for module 'gstreamer-base-1.0'
-- Found gstreamer-base-1.0, version 1.14.4
-- Checking for module 'libusbgx'
-- Found libusbgx, version 0.2.0
-- Checking for module 'libpcap'
-- No package 'libpcap' found
CMake Error at /usr/share/cmake-3.13/Modules/FindPkgConfig.cmake:452 (message):
A required package was not found
Call Stack (most recent call first):
/usr/share/cmake-3.13/Modules/FindPkgConfig.cmake:622 (_pkg_check_modules_internal)
CMakeLists.txt:13 (pkg_check_modules)

I know libpcap (including dev) is installed:

ls /usr/lib/arm-linux-gnueabihf/ | grep pcap

libpcap.a
libpcap.so
libpcap.so.0.8
libpcap.so.1.8.1

Building and installing from source works, though.

Will try testing tomorrow.

@tomasz-grobelny (Owner)

Looks to me like pkgconfig file might be missing:

# dpkg -S  `find /usr -name libpcap.pc`
libpcap0.8-dev:arm64: /usr/lib/aarch64-linux-gnu/pkgconfig/libpcap.pc
#

I would call it a libpcap packaging issue on your system.
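A quick way to confirm this (a sketch, using the 32-bit multiarch path from the ls output above):

# does pkg-config see libpcap at all?
pkg-config --exists libpcap && echo found || echo missing

# is the .pc file where cmake's FindPkgConfig would look?
ls /usr/lib/arm-linux-gnueabihf/pkgconfig/ | grep pcap

# if libpcap was built from source into /usr/local, tell pkg-config about it
export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig:$PKG_CONFIG_PATH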

@paulhothersall (Author)

Yeah, it gets past that with a manual libpcap build. The base Raspberry Pi OS is still 32-bit!

@paulhothersall (Author)

hmm

/AACS/build/AAServer $ sudo ./AAServer

Got 51, write=2
Got 51, write=2
Got 51, write=2
Got some info: 0=
Got some info: 0=
Got some info: 0=
Got some info: 1=Android Auto
Got some info: 1=Android Auto
Got some info: 1=Android Auto
Got some info: 2=Android Auto
Got some info: 2=Android Auto
Got some info: 2=Android Auto
Got some info: 3=
Got some info: 3=
Got some info: 3=
Got some info: 4=https://f1xstudio.com
Got some info: 4=https://f1xstudio.com
Got some info: 4=https://f1xstudio.com
Got some info: 5=HU-AAAAAA001
Got some info: 5=HU-AAAAAA001
Got some info: 5=HU-AAAAAA001
Got 53, exit
DefaultChannelHandler: 0
dumpfile:
ep0 event 0
ep0 event 2
got version request
version negotiation ok
auth complete
got service discovery response
channels {
  channel_id: 7
  media_input_channel {
    stream_type: Audio
    audio_config {
      sample_rate: 16000
      bits_per_sample: 16
      channel_count: 1
    }
  }
}
channels {
  channel_id: 6
  media_channel {
    media_type: Audio
    2: 2
    3 {
      1: 16000
      2: 16
      3: 1
    }
    5: 1
  }
}
channels {
  channel_id: 1
  input_channel {
    available_buttons: ENTER
    available_buttons: LEFT
    available_buttons: RIGHT
    available_buttons: UP
    available_buttons: DOWN
    available_buttons: BACK
    available_buttons: HOME
    available_buttons: PHONE
    available_buttons: CALL_END
    available_buttons: PLAY
    available_buttons: PAUSE
    available_buttons: PREV
    available_buttons: TOGGLE_PLAY
    available_buttons: NEXT
    available_buttons: MICROPHONE_1
    available_buttons: SCROLL_WHEEL
    available_buttons: SCROLL_WHEEL
    screen_config {
      width: 800
      height: 452
    }
    1: 65537
    1: 65538
    1: 65540
  }
}
channels {
  channel_id: 2
  sensor_channel {
    sensors {
      type: DrivingStatus
    }
    sensors {
      type: NightData
    }
    sensors {
      type: Location
    }
  }
}
channels {
  channel_id: 3
  media_channel {
    media_type: Video
    4 {
      1: 1
      2: 2
      5: 140
    }
    5: 1
  }
}
2: "OpenAuto Pro"
3: "Universal"
4: "2018"
5: "20180301"
6: 1
7: "BlueWave Studio"
8: "OpenAuto Pro Autoapp"
9: "1"
10: "1.0"
11: 1

DefaultChannelHandler: 7
DefaultChannelHandler: 6
InputChannelHandler: 1
DefaultChannelHandler: 2
VideoChannelHandler: 3
Error: Unhandled message type: 11

@tomasz-grobelny (Owner)

Looks like an unimplemented feature. 11 is MessageType::PingRequest, sent by the headunit, and AACS should probably send a PingResponse. In my setup the headunit did not send a ping request, so I have no way to test. This is therefore a first opportunity to implement a feature in AAServer :-) It shouldn't be too complicated; best to start by sniffing the transmission between your Openauto headunit and the phone to see how the phone replies, or by analyzing the openauto source code to see what it expects.
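One way to do the sniffing on a Linux box (a sketch, assuming usbmon capture support in libpcap/tcpdump; the bus number is an example taken from lsusb):

# expose USB traffic to capture tools
sudo modprobe usbmon

# find the bus the phone is attached to
lsusb

# capture everything on that bus (here: bus 1), then inspect it in Wireshark
sudo tcpdump -i usbmon1 -w aa-ping.pcap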

@paulhothersall (Author)

I have an idea: I can set up a couple of Pi4s, one with Ubuntu arm64 and the other with Raspberry Pi OS (armv7/armhf/32-bit), such that they can be connected to remotely for any remote debugging?

And given that the 32-bit build seems to work, at least up to the last error about the unexpected messageType, I can throw a Pi Zero W in as well. Whilst it has a (very) slow CPU, it has hardware H.264 decode and encode, so it could take, for example, a cast or sunk video source and onward send (transcode) it to the head unit. So AirPlay / Chromecast for media could both be options!

@samehhady

Very interesting project! I was wondering if I can test on a Pi Zero - I have one in hand and am ready to play with it.

Is there some sort of image that I can use directly?

@samehhady

Would love to see the possibility of opening a web page on the screen; lots of ideas could be implemented in that case.

@tomasz-grobelny (Owner)

I have an idea: I can set up a couple of Pi4s, one with Ubuntu arm64 and the other with Raspberry Pi OS (armv7/armhf/32-bit), such that they can be connected to remotely for any remote debugging?

You could try. I have some doubts about how this would work for remote debugging, given there would be no possibility to disconnect the USB cable or to reset/see/interact with the headunit - or would that be an option as well?

@tomasz-grobelny (Owner)

Would love to see the possibility of opening a web page on the screen; lots of ideas could be implemented in that case.

My idea was to go for an electronjs app; I even started fiddling with it (do not expect anything end-user ready anytime soon). It would also be great to have the possibility to launch new apps and switch between apps dynamically - that, however, requires a kind of 'gstreamer router' - I haven't investigated that yet though.

@paulhothersall (Author)

I have an idea: I can set up a couple of Pi4s, one with Ubuntu arm64 and the other with Raspberry Pi OS (armv7/armhf/32-bit), such that they can be connected to remotely for any remote debugging?

You could try. I have some doubts about how this would work for remote debugging, given there would be no possibility to disconnect the USB cable or to reset/see/interact with the headunit - or would that be an option as well?

I can easily have an HDMI capture device connected back to USB, so it's easy to frame-grab or restream what's going on, even remotely.

And at least on the primary USB-A ports on the Raspberry Pi - possibly on all of them - data can be disabled and re-enabled in software, essentially the same as physically inserting cables. There is a small caveat that on one particular port you have to disable the whole hub; otherwise it can be done at the individual port level - see the sketch below.
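A sketch of the software 'unplug' via sysfs (the device path 1-1 is an example; pick the right one from /sys/bus/usb/devices):

# simulate unplugging the device on port 1-1 ...
echo 0 | sudo tee /sys/bus/usb/devices/1-1/authorized

# ... and plugging it back in
echo 1 | sudo tee /sys/bus/usb/devices/1-1/authorized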

@paulhothersall (Author)

Would love to see the possibility of opening a web page on the screen; lots of ideas could be implemented in that case.

My idea was to go for an electronjs app; I even started fiddling with it (do not expect anything end-user ready anytime soon). It would also be great to have the possibility to launch new apps and switch between apps dynamically - that, however, requires a kind of 'gstreamer router' - I haven't investigated that yet though.

Depending on the device, extracting the GPU framebuffer may be "all" that's needed.

The Pi is quite capable of doing it, so whatever is rendered to its (headless) concept of the display can be forwarded to the head unit via GStreamer.
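Roughly along these lines (a sketch: ximagesrc assumes an X display to grab, v4l2h264enc is the Pi 4's hardware H.264 encoder element, and the filesink stands in for whatever AAServer actually consumes):

gst-launch-1.0 ximagesrc use-damage=false \
    ! videoconvert \
    ! v4l2h264enc \
    ! 'video/x-h264,level=(string)4' \
    ! filesink location=/tmp/display.h264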

I should have some time later to do a setup. I could message you directly with the login details etc. for it?

@tomasz-grobelny (Owner)

Depending on the device, extracting the GPU framebuffer may be "all" that's needed.

I bet it is not very complicated - it just requires time for someone to test/implement/describe it :-) For sure it should be done outside AAServer, and I am wondering whether some solution already exists. Currently the window contents of anbox+OsmAnd are sent to AAServer using the ximagesrc plugin. What needs to be done is an app that would dynamically switch the video streams sent to AAServer and direct the events received from AAServer to the correct application.

As for login data - sure, my mail is publicly available on GitHub.

@adrianalin

Hello @tomasz-grobelny, I managed to set up openauto on my Linux laptop in order to test AACS. I can run autoapp with QtCreator, and first I wanted to try openauto with my phone. When connecting my phone to the laptop via USB, the video stream cannot start and I get:

Warning: "No decoder available for type 'video/x-h264, stream-format=(string)byte-stream'."
Error: "Your GStreamer installation is missing a plug-in."
appsrc: push buffer wrong state
appsrc: push buffer wrong state
appsrc: push buffer wrong state
appsrc: push buffer wrong state
...

Did you encounter this error as well? If so, how did you fix it?

@tomasz-grobelny (Owner)

I do not recall anything like that, but my first thought would be to install all available packages with gstreamer plugins on your system (good, bad, ugly, etc.).
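On a Debian/Ubuntu system that amounts to roughly:

sudo apt install gstreamer1.0-plugins-base gstreamer1.0-plugins-good \
                 gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly \
                 gstreamer1.0-libav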

@adrianalin

Thank you, got it working after I installed gst-plugins-bad.

@adrianalin

adrianalin commented Mar 7, 2021

Hello, I managed to build a yocto image for the Raspberry Pi4 which runs AAServer. You can check it out here: https://github.com/adrianalin/meta-aacs

@tomasz-grobelny (Owner)

That's great :-) I was thinking of some kind of build-image automation, but wasn't aware of yocto. Do you think it makes sense to have the yocto build definition in the AACS repository, or are you planning to maintain a separate repository with the AACS image definition? Also, I see you disabled GetEvents - any particular reason for that? X11 dependency? Any plans to work on that in the future?

Getting back to reading about yocto and trying your work.

@tomasz-grobelny (Owner)

tomasz-grobelny commented Mar 7, 2021

$ git remote -v
origin  https://github.com/adrianalin/csng_yocto.git (fetch)
origin  https://github.com/adrianalin/csng_yocto.git (push)
$ git branch -v
* aacs-image 63911fc Add image for building aacs.
  main       aec3884 Local audio on jack;
$ bitbake aacs-image
ERROR: no recipe files to build, check your BBPATH and BBFILES?
NOTE: Cache: default: Not using a cache. Set CACHE = <directory> to enable.
Loading cache: 100% |                                                                                                 | ETA:  --:--:--
Loaded 0 entries from dependency cache.
ERROR: Nothing PROVIDES 'aacs-image'

Summary: There were 2 ERROR messages shown, returning a non-zero exit code.
$ cd images/
$ bitbake aacs-image
NOTE: Cache: default: Not using a cache. Set CACHE = <directory> to enable.
Loading cache: 100% |                                                                                                 | ETA:  --:--:--
Loaded 0 entries from dependency cache.
Parsing recipes: 100% |################################################################################################| Time: 0:00:00
Parsing of 4 .bb files complete (0 cached, 4 parsed). 4 targets, 0 skipped, 0 masked, 0 errors.
NOTE: Resolving any missing task queue dependencies
Initialising tasks: 100% |#############################################################################################| Time: 0:00:00
NOTE: No setscene tasks
NOTE: Executing Tasks
NOTE: Tasks Summary: Attempted 1 tasks of which 0 didn't need to be rerun and all succeeded.
$

Being new to yocto I am probably missing something simple.

UPDATE: reading more and more about yocto there is some progress with the build as well...

@adrianalin

That's great :-) I was thinking of some kind of build-image automation, but wasn't aware of yocto. Do you think it makes sense to have the yocto build definition in the AACS repository, or are you planning to maintain a separate repository with the AACS image definition? Also, I see you disabled GetEvents - any particular reason for that? X11 dependency? Any plans to work on that in the future?

Getting back to reading about yocto and trying your work.

Regarding a separate repo, I'm not sure I can maintain one for a long time. In any case, the yocto layer (which will contain the AACS recipe) will have to be separate from the source code, not in the AACS repository.

Will try to build with X11 as well, and re-enable GetEvents.

@adrianalin

$ bitbake aacs-image
ERROR: no recipe files to build, check your BBPATH and BBFILES?
ERROR: Nothing PROVIDES 'aacs-image'
[...]
$ cd images/
$ bitbake aacs-image
[...]
NOTE: Tasks Summary: Attempted 1 tasks of which 0 didn't need to be rerun and all succeeded.

Being new to yocto I am probably missing something simple.

UPDATE: reading more and more about yocto there is some progress with the build as well...

Be aware that a yocto build may take quite a long time.
The repo I prepared is actually just a layer for yocto, so you need to download all the other layers and prepare the layer directories yourself (see bblayers.conf).
After that you also need to source poky/oe-init-build-env build/.

This is how the layer directories look on my side (notice the csng_yocto layer, which contains the aacs recipes):

[screenshot of the layer directory structure]
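For anyone following along, the setup amounts to something like this (a sketch; the branch names and layer list are assumptions - bblayers.conf in the layer is the authoritative reference):

# fetch poky and the layers the image depends on
git clone -b dunfell git://git.yoctoproject.org/poky
git clone -b dunfell https://github.com/openembedded/meta-openembedded
git clone -b dunfell https://github.com/agherzan/meta-raspberrypi
git clone https://github.com/adrianalin/csng_yocto

# create the build directory and enter the build environment
source poky/oe-init-build-env build/

# register the extra layers, then build
bitbake-layers add-layer ../meta-openembedded/meta-oe
bitbake-layers add-layer ../meta-raspberrypi
bitbake-layers add-layer ../csng_yocto
bitbake aacs-image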

@adrianalin

I also started to clean up the layer for aacs, and will probably put it in a separate repo as you suggested (right now it contains all kinds of recipes related to openauto/crankshaft).

@adrianalin

adrianalin commented Mar 8, 2021

I managed to create a completely separate repo dedicated to meta-aacs: https://github.com/adrianalin/meta-aacs :) (I will try to maintain it as long as I can). Next I will try to enable GetEvents.

@tomasz-grobelny (Owner)

The build is almost halfway through - it seems it will take several more hours. If it succeeds I will have an RPi4 build that I can do nothing with :-) Would it be difficult to rebuild it for the Odroid N2? It seems yocto layers for the hardware should exist (meta-odroid? meta-meson?).

Also, do you know if it would be feasible to do automatic builds on some public service (GitHub?) and publish the builds somewhere?

@adrianalin

adrianalin commented Mar 8, 2021

The build is almost halfway through - it seems it will take several more hours. If it succeeds I will have an RPi4 build that I can do nothing with :-) Would it be difficult to rebuild it for the Odroid N2? It seems yocto layers for the hardware should exist (meta-odroid? meta-meson?).

You are right, it should be quite easy to rebuild for Odroid: you need meta-odroid instead of meta-raspberrypi. You may need to start the build all over (lots of time spent again). Don't forget to change the MACHINE in local.conf.
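i.e. in conf/local.conf, something like (the machine name is defined by the BSP layer; odroid-n2 is assumed from meta-odroid):

# conf/local.conf
MACHINE = "odroid-n2"    # instead of e.g. raspberrypi4-64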

Also, do you know if it would be feasible to do automatic builds on some public service (GitHub?) and publish the builds somewhere?

This would be nice indeed, but I haven't done it until now. Do let me know if there is any way this can be done.

Update: here https://github.com/akuster/meta-odroid there seems to be a .gitlab-ci.yml which may be doing exactly that.

@tomasz-grobelny (Owner)

The build is almost halfway through - it seems it will take several more hours. If it succeeds I will have an RPi4 build that I can do nothing with :-) Would it be difficult to rebuild it for the Odroid N2? It seems yocto layers for the hardware should exist (meta-odroid? meta-meson?).

You are right, it should be quite easy to rebuild for Odroid: you need meta-odroid instead of meta-raspberrypi. You may need to start the build all over (lots of time spent again). Don't forget to change the MACHINE in local.conf.

Managed to build for odroid-n2; the system starts and I can log in. The kernel is 5.10, but it is missing the dwc2 and libcomposite kernel modules (and possibly other related ones). I will need to investigate how to add them (hints welcome).

Also, do you know if it would be feasible to do automatic builds on some public service (GitHub?) and publish the builds somewhere?

This would be nice indeed, but I haven't done it until now. Do let me know if there is any way this can be done.

Unfortunately I am not aware of any such service available freely (it would require quite some resources). Anyway, a local build is good enough for now.

Update: here https://github.com/akuster/meta-odroid there seems to be a .gitlab-ci.yml which may be doing exactly that.

Good start, but I understand I would need to set up my own GitLab instance, right?

I am also wondering how my changes could be merged into your repo - I had to remove recipes-bsp/bootfiles/rpi-config_git.bbappend (it does not seem applicable to Odroid) and make changes in local.conf (different usernames for odroid and rpi). But that's for after I get the base thing running (see the kernel modules issue above).

@adrianalin

adrianalin commented Mar 11, 2021

To enable dwc2 (USB_DWC2) and libcomposite (USB_LIBCOMPOSITE) you may need to run bitbake -c menuconfig virtual/kernel, as described here: http://variwiki.com/index.php?title=Yocto_Customizing_the_Linux_kernel, and look for those options in menuconfig.
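To make the options stick across rebuilds, the usual yocto route is a config fragment in a bbappend rather than a one-off menuconfig - a sketch, with the paths and recipe name assumed (this works for linux-yocto-style kernel recipes):

# recipes-kernel/linux/linux-yocto_%.bbappend
FILESEXTRAPATHS_prepend := "${THISDIR}/files:"
SRC_URI += "file://usb-gadget.cfg"

# files/usb-gadget.cfg - build the gadget stack as modules
CONFIG_USB_DWC2=m
CONFIG_USB_LIBCOMPOSITE=m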

Regarding the merging of the changes, I don't know for sure. Maybe a separate branch? Indeed, some recipes, like the one you mentioned, should not apply to Odroid.

@tomasz-grobelny (Owner)

I don't think branches are the way to go in this case - I made a pull request to your repo, please have a look.

@adrianalin

Yes, you are right. Looking at it now.
