Building package "gpt4all-chat-git" INFO: Starting build... INFO: Verifying bootstrap image /home/u726578/chaotic/cache/lower/20240502104153.sif WARNING: integrity: signature not found for object group 1 WARNING: Bootstrap image could not be verified, but build will continue. INFO: Creating sandbox directory... INFO: Build complete: /scratch/chaotic/sandbox/pkg47c6c889e25 :: Synchronizing package databases... core downloading... extra downloading... multilib downloading... chaotic-aur downloading... :: Starting full system upgrade... resolving dependencies... looking for conflicting packages... Packages (5) jansson-2.14-4 perl-error-0.17029-5 perl-mailtools-2.21-7 perl-timedate-2.33-5 git-2.45.0-1 Total Installed Size: 27.96 MiB Net Upgrade Size: 27.77 MiB :: Proceed with installation? [Y/n] checking keyring... checking package integrity... loading package files... checking for file conflicts... checking available disk space... :: Processing package changes... installing perl-error... installing perl-timedate... installing perl-mailtools... installing git... Optional dependencies for git tk: gitk and git gui openssh: ssh transport and crypto perl-libwww: git svn perl-term-readkey: git svn and interactive.singlekey setting perl-io-socket-ssl: git send-email TLS support perl-authen-sasl: git send-email TLS support perl-mediawiki-api: git mediawiki support perl-datetime-format-iso8601: git mediawiki support perl-lwp-protocol-https: git mediawiki https support perl-cgi: gitweb (web interface) support python: git svn & git p4 subversion: git svn org.freedesktop.secrets: keyring credential helper libsecret: libsecret credential helper [installed] upgrading jansson... :: Running post-transaction hooks... (1/4) Creating system user accounts... Creating group 'git' with GID 973. Creating user 'git' (git daemon user) with UID 973 and GID 973. (2/4) Reloading system manager configuration... Skipped: Current root is not booted. (3/4) Arming ConditionNeedsUpdate... (4/4) Warn about old perl modules warning: git-2.45.0-1 is up to date -- skipping resolving dependencies... :: There are 2 providers available for libgl: :: Repository extra 1) libglvnd :: Repository chaotic-aur 2) nvidia-340xx-utils Enter a number (default=1): :: There are 2 providers available for jack: :: Repository extra 1) jack2 2) pipewire-jack Enter a number (default=1): :: There are 2 providers available for libzimg.so=2-64: :: Repository extra 1) zimg :: Repository chaotic-aur 2) zimg-git Enter a number (default=1): :: There are 12 providers available for ttf-font: :: Repository extra 1) gnu-free-fonts 2) noto-fonts 3) ttf-bitstream-vera 4) ttf-croscore 5) ttf-dejavu 6) ttf-droid 7) ttf-ibm-plex 8) ttf-input 9) ttf-liberation 10) ttf-mona-sans :: Repository chaotic-aur 11) apple-fonts 12) ttf-ms-fonts Enter a number (default=1): looking for conflicting packages... 
warning: dependency cycle detected: warning: harfbuzz will be installed before its freetype2 dependency warning: dependency cycle detected: warning: mesa will be installed before its libglvnd dependency Packages (179) alsa-lib-1.2.11-1 alsa-topology-conf-1.2.5.1-3 alsa-ucm-conf-1.2.11-1 aom-3.9.0-1 avahi-1:0.8+r194+g3f79789-2 cairo-1.18.0-2 cppdap-1.58.0-1 dav1d-1.4.1-1 default-cursors-2-1 double-conversion-3.3.0-1 duktape-2.7.0-6 ffmpeg-2:6.1.1-7 fftw-3.3.10-7 flac-1.4.3-1 fontconfig-2:2.15.0-2 freetype2-2.13.2-1 fribidi-1.0.14-1 gdk-pixbuf2-2.42.11-2 giflib-5.2.2-1 glslang-14.0.0-2 gnu-free-fonts-20120503-8 gperftools-2.15-1 graphite-1:1.3.14-3 gsm-1.0.22-1 harfbuzz-8.4.0-1 hicolor-icon-theme-0.17-3 hidapi-0.14.0-2 highway-1.1.0-1 imath-3.1.11-2 jack2-1.9.22-1 jbigkit-2.1-7 jsoncpp-1.9.5-2 l-smash-2.14.5-3 lame-3.100-4 lcms2-2.16-1 libass-0.17.1-4 libasyncns-1:0.8+r3+g68cd5af-2 libavc1394-0.5.4-6 libb2-0.98.1-2 libbluray-1.3.4-1 libbs2b-3.1.0-8 libcups-1:2.4.8-1 libdaemon-0.14-5 libdatrie-0.2.13-4 libdeflate-1.20-1 libdovi-3.3.0-1 libdrm-2.4.120-1 libedit-20230828_3.1-1 libevdev-1.13.1-1 libglvnd-1.7.0-1 libgudev-238-1 libice-1.1.1-2 libiec61883-1.2.0-7 libinput-1.25.0-1 libjpeg-turbo-3.0.2-2 libjxl-0.10.2-1 libmodplug-0.8.9.0-5 libogg-1.3.5-1 libomxil-bellagio-0.9.3-4 libopenmpt-0.7.6-2 libpciaccess-0.18.1-2 libplacebo-6.338.2-6 libpng-1.6.43-1 libproxy-0.5.6-1 libpulse-17.0-3 libraw1394-2.1.2-3 librsvg-2:2.58.0-1 libsamplerate-0.2.2-2 libsm-1.2.4-1 libsndfile-1.2.2-2 libsoxr-0.1.3-3 libssh-0.10.6-2 libthai-0.1.29-3 libtheora-1.1.1-6 libtiff-4.6.0-4 libunibreak-6.1-1 libunwind-1.8.1-2 libuv-1.48.0-2 libva-2.21.0-1 libvdpau-1.5-2 libvorbis-1.3.7-3 libvpl-2.10.2-1 libvpx-1.14.0-1 libwacom-2.11.0-1 libwebp-1.4.0-1 libx11-1.8.9-1 libxau-1.0.11-2 libxcb-1.17.0-1 libxcomposite-0.4.6-1 libxcursor-1.2.2-1 libxdamage-1.1.6-1 libxdmcp-1.1.5-1 libxext-1.3.6-1 libxfixes-6.0.1-1 libxft-2.3.8-1 libxi-1.8.1-1 libxkbcommon-1.7.0-2 libxkbcommon-x11-1.7.0-2 libxkbfile-1.1.3-1 libxmu-1.2.1-1 libxrandr-1.5.4-1 libxrender-0.9.11-1 libxshmfence-1.3.2-1 libxslt-1.1.39-2 libxt-1.3.0-1 libxtst-1.2.4-1 libxv-1.0.12-1 libxxf86vm-1.1.5-1 llvm-libs-17.0.6-4 lm_sensors-1:3.6.0.r41.g31d1f125-2 lzo-2.10-5 md4c-0.5.2-1 mesa-1:24.0.6-2 minizip-1:1.3.1-1 mpdecimal-4.0.0-2 mpg123-1.32.5-1 mtdev-1.1.6-2 nspr-4.35-2 nss-3.99-1 ocl-icd-2.3.2-1 opencore-amr-0.1.6-1 openexr-3.2.4-1 openjpeg2-2.5.2-1 opus-1.5.2-1 pango-1:1.52.2-1 pixman-0.43.4-1 portaudio-1:19.7.0-2 qt6-positioning-6.7.0-2 qt6-translations-6.7.0-1 qt6-webchannel-6.7.0-1 qt6-websockets-6.7.0-1 rav1e-0.7.1-1 rhash-1.4.4-1 rubberband-3.3.0-1 sdl2-2.30.2-1 shared-mime-info-2.4-1 snappy-1.1.10-1 speex-1.2.1-1 speexdsp-1.2.1-1 spirv-tools-2023.6-1 srt-1.5.3-1 svt-av1-2.0.0-1 tslib-1.23-1 v4l-utils-1.26.1-1 vapoursynth-R66-2 vid.stab-1.1.1-1 vmaf-3.0.0-1 vulkan-headers-1:1.3.279-1 vulkan-icd-loader-1.3.279-1 wayland-1.22.0-1 x264-3:0.164.r3108.31e19f9-1 x265-3.5-3 xcb-proto-1.17.0-2 xcb-util-0.4.1-1 xcb-util-cursor-0.1.5-1 xcb-util-image-0.4.1-2 xcb-util-keysyms-0.4.1-4 xcb-util-renderutil-0.3.10-1 xcb-util-wm-0.4.2-1 xdg-utils-1.2.1-1 xkeyboard-config-2.41-1 xorg-xprop-1.2.7-1 xorg-xset-1.2.5-1 xorgproto-2024.1-2 xvidcore-1.3.7-2 xxhash-0.8.2-1 zimg-3.0.5-1 cmake-3.29.2-1 python-3.12.3-1 qt6-5compat-6.7.0-1 qt6-base-6.7.0-3 qt6-declarative-6.7.0-1 qt6-httpserver-6.7.0-1 qt6-shadertools-6.7.0-1 qt6-svg-6.7.0-1 qt6-wayland-6.7.0-1 qt6-webengine-6.7.0-1 shaderc-2023.8-1 vulkan-tools-1.3.269-1 Total Download Size: 79.30 MiB Total Installed Size: 1148.96 MiB :: Proceed 
with installation? [Y/n] :: Retrieving packages... qt6-webengine-6.7.0-1-x86_64 downloading... qt6-positioning-6.7.0-2-x86_64 downloading... vulkan-tools-1.3.269-1-x86_64 downloading... qt6-webchannel-6.7.0-1-x86_64 downloading... qt6-websockets-6.7.0-1-x86_64 downloading... qt6-httpserver-6.7.0-1-x86_64 downloading... checking keyring... checking package integrity... loading package files... checking for file conflicts... checking available disk space... :: Processing package changes... installing double-conversion... installing libpng... installing graphite... Optional dependencies for graphite graphite-docs: Documentation installing harfbuzz... Optional dependencies for harfbuzz harfbuzz-utils: utilities installing freetype2... installing fontconfig... Creating fontconfig configuration... Rebuilding fontconfig cache... installing libb2... installing libdaemon... installing avahi... Optional dependencies for avahi gtk3: avahi-discover, avahi-discover-standalone, bshell, bssh, bvnc libevent: libevent bindings [installed] nss-mdns: NSS support for mDNS python-dbus: avahi-bookmarks, avahi-discover python-gobject: avahi-bookmarks, avahi-discover python-twisted: avahi-bookmarks qt5-base: qt5 bindings installing libcups... installing libpciaccess... installing libdrm... Optional dependencies for libdrm cairo: needed for modetest tool [pending] installing xcb-proto... installing xorgproto... installing libxdmcp... installing libxau... installing libxcb... installing libx11... installing libxext... installing libxfixes... installing libxshmfence... installing libxxf86vm... installing libedit... installing llvm-libs... installing lm_sensors... Optional dependencies for lm_sensors rrdtool: for logging with sensord perl: for sensor detection and configuration convert [installed] installing default-cursors... Optional dependencies for default-cursors adwaita-cursors: default cursor theme installing wayland... installing libomxil-bellagio... installing mesa... Optional dependencies for mesa opengl-man-pages: for the OpenGL API man pages installing libglvnd... installing libice... installing mtdev... installing libevdev... installing libgudev... installing libwacom... Optional dependencies for libwacom python-libevdev: for libwacom-show-stylus python-pyudev: for libwacom-show-stylus installing libinput... Optional dependencies for libinput gtk4: libinput debug-gui python-pyudev: libinput measure python-libevdev: libinput measure python-yaml: used by various tools installing libjpeg-turbo... Optional dependencies for libjpeg-turbo java-runtime>11: for TurboJPEG Java wrapper installing duktape... installing libproxy... installing libsm... installing xkeyboard-config... installing libxkbcommon... Optional dependencies for libxkbcommon libxkbcommon-x11: xkbcli interactive-x11 [pending] wayland: xkbcli interactive-wayland [installed] installing libxkbcommon-x11... installing md4c... installing shared-mime-info... installing tslib... installing vulkan-headers... installing xcb-util-renderutil... installing xcb-util... installing xcb-util-image... installing xcb-util-cursor... installing xcb-util-keysyms... installing xcb-util-wm... installing libxt... installing libxmu... installing xorg-xset... installing xorg-xprop... installing xdg-utils... 
Optional dependencies for xdg-utils kde-cli-tools: for KDE Plasma5 support in xdg-open exo: for Xfce support in xdg-open pcmanfm: for LXDE support in xdg-open perl-file-mimeinfo: for generic support in xdg-open perl-net-dbus: Perl extension to dbus used in xdg-screensaver perl-x11-protocol: Perl X11 protocol used in xdg-screensaver installing qt6-translations... installing qt6-base... Optional dependencies for qt6-base freetds: MS SQL driver gdk-pixbuf2: GTK platform plugin [pending] gtk3: GTK platform plugin libfbclient: Firebird/iBase driver mariadb-libs: MariaDB driver pango: GTK platform plugin [pending] perl: for syncqt [installed] postgresql-libs: PostgreSQL driver qt6-wayland: to run Qt6 applications in a Wayland session [pending] unixodbc: ODBC driver installing qt6-websockets... Optional dependencies for qt6-websockets qt6-declarative: QML bindings [pending] installing qt6-httpserver... installing qt6-declarative... Optional dependencies for qt6-declarative qt6-languageserver: for qmlls installing qt6-svg... installing alsa-topology-conf... installing alsa-ucm-conf... installing alsa-lib... installing aom... installing libxrender... installing lzo... installing pixman... installing cairo... installing dav1d... Optional dependencies for dav1d dav1d-doc: HTML documentation installing fribidi... installing gsm... installing libsamplerate... installing opus... installing jack2... Optional dependencies for jack2 a2jmidid: for ALSA MIDI to JACK MIDI bridging libffado: for firewire support using FFADO jack-example-tools: for official JACK example-clients and tools jack2-dbus: for dbus integration jack2-docs: for developer documentation realtime-privileges: for realtime privileges installing lame... installing libunibreak... installing libass... installing libraw1394... installing libavc1394... installing libbluray... Optional dependencies for libbluray java-runtime: BD-J library installing libogg... installing flac... installing libvorbis... installing mpg123... Optional dependencies for mpg123 sdl2: for sdl audio support [pending] jack: for jack audio support [installed] libpulse: for pulse audio support [pending] perl: for conplay [installed] installing libsndfile... Optional dependencies for libsndfile alsa-lib: for sndfile-play [installed] installing libbs2b... installing libiec61883... installing giflib... installing libunwind... installing gperftools... Optional dependencies for gperftools graphviz: pprof graph generation perl: pprof and pprof-symbolize commands [installed] installing highway... installing imath... Optional dependencies for imath boost-libs: python bindings python: python bindings [pending] installing libdeflate... installing openexr... installing libjxl... Optional dependencies for libjxl gdk-pixbuf2: for gdk-pixbuf loader [pending] gimp: for gimp plugin java-runtime: for JNI bindings installing libmodplug... installing libasyncns... installing libpulse... Optional dependencies for libpulse glib2: mainloop integration [installed] pulse-native-provider: PulseAudio backend installing portaudio... installing libopenmpt... installing vulkan-icd-loader... Optional dependencies for vulkan-icd-loader vulkan-driver: packaged vulkan driver installing spirv-tools... installing glslang... installing jbigkit... installing libtiff... Optional dependencies for libtiff freeglut: for using tiffgt installing lcms2... installing shaderc... installing libdovi... installing xxhash... installing libplacebo... installing gdk-pixbuf2... 
Optional dependencies for gdk-pixbuf2 libwmf: Load .wmf and .apm libopenraw: Load .dng, .cr2, .crw, .nef, .orf, .pef, .arw, .erf, .mrw, and .raf libavif: Load .avif libheif: Load .heif, .heic, and .avif libjxl: Load .jxl [installed] librsvg: Load .svg, .svgz, and .svg.gz [pending] webp-pixbuf-loader: Load .webp installing libdatrie... installing libthai... installing libxft... installing pango... installing librsvg... installing libsoxr... installing libssh... installing libtheora... installing libva... Optional dependencies for libva intel-media-driver: backend for Intel GPUs (>= Broadwell) libva-intel-driver: backend for Intel GPUs (<= Haswell) libva-mesa-driver: backend for AMD and NVIDIA GPUs installing libvdpau... Optional dependencies for libvdpau libvdpau-va-gl: driver using VAAPI mesa-vdpau: driver for Mesa nvidia-utils: driver for NVIDIA installing libvpx... installing libwebp... installing libxv... installing ocl-icd... Optional dependencies for ocl-icd opencl-driver: packaged opencl driver installing libvpl... Optional dependencies for libvpl intel-media-sdk: runtime for legacy Intel GPUs onevpl-intel-gpu: runtime for Tiger Lake and newer GPUs installing opencore-amr... installing openjpeg2... installing rav1e... installing fftw... Optional dependencies for fftw fftw-openmpi: for OpenMPI integration installing rubberband... installing libxcursor... installing hidapi... Optional dependencies for hidapi libusb: for hidapi-libusb [installed] installing sdl2... Optional dependencies for sdl2 alsa-lib: ALSA audio driver [installed] libpulse: PulseAudio audio driver [installed] jack: JACK audio driver [installed] pipewire: PipeWire audio driver libdecor: Wayland client decorations installing snappy... installing speexdsp... installing speex... installing srt... installing svt-av1... installing hicolor-icon-theme... installing v4l-utils... Optional dependencies for v4l-utils qt5-base: for qv4l2 and qvidcap alsa-lib: for qv4l2 [installed] installing zimg... installing mpdecimal... installing python... Optional dependencies for python python-setuptools: for building Python packages using tooling that is usually bundled with Python python-pip: for installing Python packages using tooling that is usually bundled with Python python-pipx: for installing Python software not packaged on Arch Linux sqlite: for a default database integration [installed] xz: for lzma [installed] tk: for tkinter installing vapoursynth... installing vid.stab... installing vmaf... installing l-smash... installing x264... installing x265... installing xvidcore... installing ffmpeg... Optional dependencies for ffmpeg avisynthplus: AviSynthPlus support frei0r-plugins: Frei0r video effects support intel-media-sdk: Intel QuickSync support (legacy) ladspa: LADSPA filters nvidia-utils: Nvidia NVDEC/NVENC support onevpl-intel-gpu: Intel QuickSync support installing libxcomposite... installing libxkbfile... installing libxdamage... installing libxrandr... installing libxslt... Optional dependencies for libxslt python: Python bindings [installed] installing libxi... installing libxtst... installing minizip... installing nspr... installing nss... installing qt6-positioning... Optional dependencies for qt6-positioning geoclue: geoclue2 plugin qt6-declarative: QML bindings [installed] qt6-serialport: NMEA plugin installing qt6-webchannel... installing gnu-free-fonts... installing qt6-webengine... Optional dependencies for qt6-webengine pipewire: WebRTC desktop sharing under Wayland installing qt6-shadertools... 
installing qt6-5compat...
Optional dependencies for qt6-5compat
    qt6-declarative: for QtGraphicalEffects [installed]
installing qt6-wayland...
installing jsoncpp...
Optional dependencies for jsoncpp
    jsoncpp-doc: documentation
installing libuv...
installing rhash...
installing cppdap...
installing cmake...
Optional dependencies for cmake
    make: for unix Makefile generator [installed]
    ninja: for ninja generator
    qt6-base: cmake-gui [installed]
installing vulkan-tools...
:: Running post-transaction hooks...
( 1/12) Creating system user accounts...
Creating group 'avahi' with GID 972.
Creating user 'avahi' (Avahi mDNS/DNS-SD daemon) with UID 972 and GID 972.
( 2/12) Reloading system manager configuration...
  Skipped: Current root is not booted.
( 3/12) Reloading user manager configuration...
  Skipped: Current root is not booted.
( 4/12) Updating udev hardware database...
( 5/12) Reloading device manager configuration...
  Skipped: Device manager is not running.
( 6/12) Arming ConditionNeedsUpdate...
( 7/12) Updating the MIME type database...
( 8/12) Updating fontconfig configuration...
( 9/12) Reloading system bus configuration...
  Skipped: Current root is not booted.
(10/12) Updating fontconfig cache...
(11/12) Probing GDK-Pixbuf loader modules...
(12/12) Updating the info directory file...
==> Making package: gpt4all-chat-git r1747.67843edc-1 (Thu 02 May 2024 11:35:00 AM -03)
==> Checking runtime dependencies...
==> Checking buildtime dependencies...
==> Retrieving sources...
  -> Cloning gpt4all git repo...
Cloning into bare repository '/home/main-builder/pkgsrc/gpt4all'...
  -> Found gpt4all-chat.desktop
  -> Found gpt4all-chat.sh
==> WARNING: Skipping verification of source file PGP signatures.
==> Validating source files with sha256sums...
    gpt4all ... Skipped
    gpt4all-chat.desktop ... Passed
    gpt4all-chat.sh ... Passed
==> Extracting sources...
  -> Creating working copy of gpt4all git repo...
Cloning into 'gpt4all'...
done.
==> Starting prepare()...
Submodule 'llama.cpp-mainline' (https://github.com/nomic-ai/llama.cpp.git) registered for path 'gpt4all-backend/llama.cpp-mainline'
Cloning into '/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline'...
Submodule path 'gpt4all-backend/llama.cpp-mainline': checked out 'a3f03b7e793ee611c4918235d4532ee535a9530d'
Submodule 'kompute' (https://github.com/nomic-ai/kompute.git) registered for path 'gpt4all-backend/llama.cpp-mainline/kompute'
Cloning into '/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute'...
Submodule path 'gpt4all-backend/llama.cpp-mainline/kompute': checked out 'd1e3b0953cf66acc94b2e29693e221427b2c1f3f'
==> Starting pkgver()...
==> Updated version: gpt4all-chat-git r1785.855fd224-1
==> Starting build()...
-- The CXX compiler identification is GNU 13.2.1
-- The C compiler identification is GNU 13.2.1
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/sbin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/sbin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Performing Test HAVE_STDATOMIC
-- Performing Test HAVE_STDATOMIC - Success
-- Found WrapAtomic: TRUE
-- Found OpenGL: /usr/lib/libOpenGL.so
-- Found WrapOpenGL: TRUE
-- Found XKB: /usr/lib/libxkbcommon.so (found suitable version "1.7.0", minimum required is "0.5.0")
-- Found WrapVulkanHeaders: /usr/include
-- Found Wayland_Client: /usr/lib/libwayland-client.so (found version "1.22.0")
-- Found Wayland_Server: /usr/lib/libwayland-server.so (found version "1.22.0")
-- Found Wayland_Cursor: /usr/lib/libwayland-cursor.so (found version "1.22.0")
-- Found Wayland_Egl: /usr/lib/libwayland-egl.so (found version "18.1.0")
-- Found Wayland: /usr/lib/libwayland-client.so;/usr/lib/libwayland-server.so;/usr/lib/libwayland-cursor.so;/usr/lib/libwayland-egl.so (found suitable version "1.22.0", minimum required is "1.15")
-- Found WaylandScanner: /usr/sbin/wayland-scanner
-- qmake binary: QMAKE_EXECUTABLE-NOTFOUND
-- Qt 6 root directory: /usr
-- Interprocedural optimization support detected
-- Kompute found
-- Found Vulkan: /lib/libvulkan.so (found version "1.3.279") found components: glslc glslangValidator
-- General purpose GPU compute framework built on Vulkan
-- =======================================================
-- KOMPUTE_OPT_LOG_LEVEL: Critical
-- KOMPUTE_OPT_USE_SPDLOG: OFF
-- KOMPUTE_OPT_DISABLE_VK_DEBUG_LAYERS: ON
-- KOMPUTE_OPT_DISABLE_VULKAN_VERSION_CHECK: ON
-- KOMPUTE_OPT_BUILD_SHADERS: OFF
-- KOMPUTE_OPT_USE_BUILT_IN_SPDLOG: ON
-- KOMPUTE_OPT_SPDLOG_ASYNC_MODE: OFF
-- KOMPUTE_OPT_USE_BUILT_IN_FMT: ON
-- KOMPUTE_OPT_USE_BUILT_IN_VULKAN_HEADER: ON
-- KOMPUTE_OPT_BUILT_IN_VULKAN_HEADER_TAG: v1.3.231
-- =======================================================
-- Version: 10.0.0
-- Build type: Release
-- Using log level Critical
-- shaderop_scale.h generating SHADEROP_SCALE_H
-- shaderop_scale_8.h generating SHADEROP_SCALE_8_H
-- shaderop_add.h generating SHADEROP_ADD_H
-- shaderop_addrow.h generating SHADEROP_ADDROW_H
-- shaderop_mul.h generating SHADEROP_MUL_H
-- shaderop_silu.h generating SHADEROP_SILU_H
-- shaderop_relu.h generating SHADEROP_RELU_H
-- shaderop_gelu.h generating SHADEROP_GELU_H
-- shaderop_softmax.h generating SHADEROP_SOFTMAX_H
-- shaderop_norm.h generating SHADEROP_NORM_H
-- shaderop_rmsnorm.h generating SHADEROP_RMSNORM_H
-- shaderop_diagmask.h generating SHADEROP_DIAGMASK_H
-- shaderop_mul_mat_mat_f32.h generating SHADEROP_MUL_MAT_MAT_F32_H
-- shaderop_mul_mat_f16.h generating SHADEROP_MUL_MAT_F16_H
-- shaderop_mul_mat_q8_0.h generating SHADEROP_MUL_MAT_Q8_0_H
-- shaderop_mul_mat_q4_0.h generating SHADEROP_MUL_MAT_Q4_0_H
-- shaderop_mul_mat_q4_1.h generating SHADEROP_MUL_MAT_Q4_1_H
-- shaderop_mul_mat_q6_k.h generating SHADEROP_MUL_MAT_Q6_K_H
-- shaderop_getrows_f16.h generating SHADEROP_GETROWS_F16_H
-- shaderop_getrows_q4_0.h generating SHADEROP_GETROWS_Q4_0_H
-- shaderop_getrows_q4_1.h generating SHADEROP_GETROWS_Q4_1_H
-- shaderop_getrows_q6_k.h
generating SHADEROP_GETROWS_Q6_K_H -- shaderop_rope_f16.h generating SHADEROP_ROPE_F16_H -- shaderop_rope_f32.h generating SHADEROP_ROPE_F32_H -- shaderop_cpy_f16_f16.h generating SHADEROP_CPY_F16_F16_H -- shaderop_cpy_f16_f32.h generating SHADEROP_CPY_F16_F32_H -- shaderop_cpy_f32_f16.h generating SHADEROP_CPY_F32_F16_H -- shaderop_cpy_f32_f32.h generating SHADEROP_CPY_F32_F32_H -- CMAKE_SYSTEM_PROCESSOR: x86_64 -- x86 detected -- Configuring ggml implementation target llama-mainline-default in /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline -- x86 detected -- Configuring model implementation target llamamodel-mainline-default -- Configuring model implementation target gptj-default -- Configuring ggml implementation target llama-mainline-avxonly in /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline -- x86 detected -- Configuring model implementation target llamamodel-mainline-avxonly -- Configuring model implementation target gptj-avxonly CMake Warning (dev) at /usr/lib/cmake/Qt6Core/Qt6CoreMacros.cmake:3126 (message): Qt policy QTP0001 is not set: ':/qt/qml/' is the default resource prefix for QML modules. Check https://doc.qt.io/qt-6/qt-cmake-policy-qtp0001.html for policy details. Use the qt_policy command to set the policy and suppress this warning. Call Stack (most recent call first): /usr/lib/cmake/Qt6Qml/Qt6QmlMacros.cmake:468 (__qt_internal_setup_policy) /usr/lib/cmake/Qt6Qml/Qt6QmlMacros.cmake:776 (qt6_add_qml_module) CMakeLists.txt:97 (qt_add_qml_module) This warning is for project developers. Use -Wno-dev to suppress it. -- Configuring done (20.0s) -- Generating done (0.1s) -- Build files have been written to: /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build [ 0%] Built target llmodel_autogen_timestamp_deps [ 23%] Built target chat_tooling [ 24%] Built target chat_qmlimportscan [ 25%] Generating ShaderLogisticRegression.hpp [ 25%] Generating ShaderOpMult.hpp [ 25%] Built target fmt_autogen_timestamp_deps [ 26%] Automatic MOC for target llmodel [ 26%] Built target xxd_autogen_timestamp_deps [ 26%] Built target kp_shader [ 26%] Automatic MOC for target fmt [ 27%] Automatic MOC for target xxd [ 27%] Built target llmodel_autogen [ 27%] Built target fmt_autogen [ 28%] Building CXX object llmodel/CMakeFiles/llmodel.dir/llmodel_autogen/mocs_compilation.cpp.o [ 28%] Built target xxd_autogen [ 28%] Building CXX object llmodel/CMakeFiles/llmodel.dir/llmodel.cpp.o [ 29%] Building CXX object llmodel/CMakeFiles/llmodel.dir/llmodel_shared.cpp.o [ 29%] Building CXX object llmodel/CMakeFiles/llmodel.dir/llmodel_c.cpp.o [ 29%] Building CXX object llmodel/llama.cpp-mainline/kompute/CMakeFiles/xxd.dir/xxd_autogen/mocs_compilation.cpp.o [ 30%] Building C object llmodel/llama.cpp-mainline/kompute/CMakeFiles/xxd.dir/external/bin/xxd.c.o [ 30%] Building CXX object _deps/fmt-build/CMakeFiles/fmt.dir/fmt_autogen/mocs_compilation.cpp.o [ 31%] Building CXX object _deps/fmt-build/CMakeFiles/fmt.dir/src/format.cc.o [ 31%] Building CXX object _deps/fmt-build/CMakeFiles/fmt.dir/src/os.cc.o [ 31%] Linking CXX executable ../../../bin/xxd [ 31%] Built target xxd [ 31%] Compiling kompute-shaders/op_scale.comp to kompute-shaders/op_scale.comp.spv [ 31%] Compiling kompute-shaders/op_scale_8.comp to kompute-shaders/op_scale_8.comp.spv [ 31%] Compiling kompute-shaders/op_add.comp to kompute-shaders/op_add.comp.spv [ 31%] Compiling kompute-shaders/op_addrow.comp to kompute-shaders/op_addrow.comp.spv [ 31%] Compiling kompute-shaders/op_mul.comp to 
kompute-shaders/op_mul.comp.spv [ 32%] Compiling kompute-shaders/op_silu.comp to kompute-shaders/op_silu.comp.spv [ 32%] Compiling kompute-shaders/op_relu.comp to kompute-shaders/op_relu.comp.spv [ 32%] Compiling kompute-shaders/op_gelu.comp to kompute-shaders/op_gelu.comp.spv [ 32%] Compiling kompute-shaders/op_softmax.comp to kompute-shaders/op_softmax.comp.spv [ 32%] Compiling kompute-shaders/op_norm.comp to kompute-shaders/op_norm.comp.spv [ 33%] Compiling kompute-shaders/op_rmsnorm.comp to kompute-shaders/op_rmsnorm.comp.spv [ 34%] Compiling kompute-shaders/op_diagmask.comp to kompute-shaders/op_diagmask.comp.spv [ 35%] Compiling kompute-shaders/op_mul_mat_mat_f32.comp to kompute-shaders/op_mul_mat_mat_f32.comp.spv [ 35%] Compiling kompute-shaders/op_mul_mat_f16.comp to kompute-shaders/op_mul_mat_f16.comp.spv [ 36%] Compiling kompute-shaders/op_mul_mat_q8_0.comp to kompute-shaders/op_mul_mat_q8_0.comp.spv [ 36%] Compiling kompute-shaders/op_mul_mat_q4_0.comp to kompute-shaders/op_mul_mat_q4_0.comp.spv [ 37%] Compiling kompute-shaders/op_mul_mat_q4_1.comp to kompute-shaders/op_mul_mat_q4_1.comp.spv [ 37%] Compiling kompute-shaders/op_mul_mat_q6_k.comp to kompute-shaders/op_mul_mat_q6_k.comp.spv [ 37%] Compiling kompute-shaders/op_getrows_f16.comp to kompute-shaders/op_getrows_f16.comp.spv [ 38%] Compiling kompute-shaders/op_getrows_q4_0.comp to kompute-shaders/op_getrows_q4_0.comp.spv [ 38%] Compiling kompute-shaders/op_getrows_q4_1.comp to kompute-shaders/op_getrows_q4_1.comp.spv [ 39%] Compiling kompute-shaders/op_getrows_q6_k.comp to kompute-shaders/op_getrows_q6_k.comp.spv [ 39%] Compiling kompute-shaders/op_rope_f16.comp to kompute-shaders/op_rope_f16.comp.spv [ 40%] Compiling kompute-shaders/op_rope_f32.comp to kompute-shaders/op_rope_f32.comp.spv [ 41%] Compiling kompute-shaders/op_cpy_f16_f16.comp to kompute-shaders/op_cpy_f16_f16.comp.spv [ 41%] Compiling kompute-shaders/op_cpy_f16_f32.comp to kompute-shaders/op_cpy_f16_f32.comp.spv [ 42%] Compiling kompute-shaders/op_cpy_f32_f16.comp to kompute-shaders/op_cpy_f32_f16.comp.spv [ 42%] Compiling kompute-shaders/op_cpy_f32_f32.comp to kompute-shaders/op_cpy_f32_f32.comp.spv [ 42%] Converting to hpp: shaderop_scale.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 43%] Converting to hpp: shaderop_scale_8.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 44%] Linking CXX shared library ../../bin/libfmt.so [ 44%] Converting to hpp: shaderop_add.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 45%] Converting to hpp: shaderop_addrow.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 45%] Converting to hpp: shaderop_mul.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 45%] Converting to hpp: shaderop_silu.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 46%] Converting to hpp: shaderop_relu.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 46%] Converting to hpp: shaderop_gelu.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 47%] Converting to hpp: shaderop_softmax.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 47%] Converting to hpp: shaderop_norm.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 48%] Converting to hpp: shaderop_diagmask.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 48%] Converting to hpp: shaderop_rmsnorm.comp.spv 
/home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 48%] Converting to hpp: shaderop_mul_mat_mat_f32.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 50%] Converting to hpp: shaderop_mul_mat_f16.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 51%] Converting to hpp: shaderop_mul_mat_q8_0.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 52%] Converting to hpp: shaderop_mul_mat_q4_0.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 52%] Converting to hpp: shaderop_mul_mat_q4_1.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 52%] Converting to hpp: shaderop_mul_mat_q6_k.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 53%] Converting to hpp: shaderop_getrows_f16.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 53%] Converting to hpp: shaderop_getrows_q4_0.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 54%] Converting to hpp: shaderop_getrows_q4_1.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 54%] Converting to hpp: shaderop_getrows_q6_k.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 55%] Converting to hpp: shaderop_rope_f16.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 55%] Converting to hpp: shaderop_rope_f32.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 55%] Converting to hpp: shaderop_cpy_f16_f16.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 56%] Converting to hpp: shaderop_cpy_f16_f32.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 56%] Converting to hpp: shaderop_cpy_f32_f16.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 56%] Converting to hpp: shaderop_cpy_f32_f32.comp.spv /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin/xxd [ 56%] Built target generated_shaders [ 56%] Linking CXX shared library ../bin/libllmodel.so /usr/sbin/c++ -fPIC -march=x86-64 -mtune=generic -O2 -pipe -fno-plt -fexceptions -Wp,-D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security -fstack-clash-protection -fcf-protection -Wp,-D_GLIBCXX_ASSERTIONS -flto=auto -O3 -DNDEBUG -Wl,-O1,--sort-common,--as-needed,-z,relro,-z,now -flto=auto -shared -Wl,-soname,libllmodel.so.0 -o ../bin/libllmodel.so.0.5.0 CMakeFiles/llmodel.dir/llmodel_autogen/mocs_compilation.cpp.o CMakeFiles/llmodel.dir/llmodel.cpp.o CMakeFiles/llmodel.dir/llmodel_shared.cpp.o CMakeFiles/llmodel.dir/llmodel_c.cpp.o [ 56%] Built target fmt [ 56%] Built target kp_logger_autogen_timestamp_deps [ 56%] Automatic MOC for target kp_logger [ 56%] Built target kp_logger_autogen [ 57%] Building CXX object llmodel/llama.cpp-mainline/kompute/src/logger/CMakeFiles/kp_logger.dir/kp_logger_autogen/mocs_compilation.cpp.o [ 57%] Building CXX object llmodel/llama.cpp-mainline/kompute/src/logger/CMakeFiles/kp_logger.dir/Logger.cpp.o [ 58%] Linking CXX static library libkp_logger.a [ 58%] Built target kp_logger [ 58%] Built target kompute_autogen_timestamp_deps [ 58%] Automatic MOC for target kompute [ 58%] Built target kompute_autogen [ 58%] Building CXX object llmodel/llama.cpp-mainline/kompute/src/CMakeFiles/kompute.dir/kompute_autogen/mocs_compilation.cpp.o [ 58%] Building CXX object llmodel/llama.cpp-mainline/kompute/src/CMakeFiles/kompute.dir/Algorithm.cpp.o [ 59%] Building CXX object 
llmodel/llama.cpp-mainline/kompute/src/CMakeFiles/kompute.dir/Manager.cpp.o [ 59%] Building CXX object llmodel/llama.cpp-mainline/kompute/src/CMakeFiles/kompute.dir/OpAlgoDispatch.cpp.o [ 60%] Building CXX object llmodel/llama.cpp-mainline/kompute/src/CMakeFiles/kompute.dir/OpMemoryBarrier.cpp.o [ 60%] Building CXX object llmodel/llama.cpp-mainline/kompute/src/CMakeFiles/kompute.dir/OpTensorCopy.cpp.o [ 61%] Building CXX object llmodel/llama.cpp-mainline/kompute/src/CMakeFiles/kompute.dir/OpTensorFill.cpp.o [ 61%] Building CXX object llmodel/llama.cpp-mainline/kompute/src/CMakeFiles/kompute.dir/OpTensorSyncDevice.cpp.o [ 61%] Built target llmodel [ 61%] Built target chat_autogen_timestamp_deps [ 61%] Automatic MOC for target chat [ 61%] Built target chat_autogen [ 61%] Running AUTOMOC file extraction for target chat [ 61%] Running rcc for resource chat_raw_qml_0 [ 62%] Running rcc for resource qmake_gpt4all [ 62%] Running moc --collect-json for target chat [ 62%] Generating meta_types/qt6chat_release_metatypes.json [ 62%] Automatic QML type registration for target chat [ 63%] Building CXX object CMakeFiles/chat.dir/chat_autogen/mocs_compilation.cpp.o [ 63%] Building CXX object llmodel/llama.cpp-mainline/kompute/src/CMakeFiles/kompute.dir/OpTensorSyncLocal.cpp.o [ 64%] Building CXX object llmodel/llama.cpp-mainline/kompute/src/CMakeFiles/kompute.dir/OpBufferSyncDevice.cpp.o [ 64%] Building CXX object llmodel/llama.cpp-mainline/kompute/src/CMakeFiles/kompute.dir/OpBufferSyncLocal.cpp.o [ 64%] Building CXX object CMakeFiles/chat.dir/main.cpp.o [ 65%] Building CXX object llmodel/llama.cpp-mainline/kompute/src/CMakeFiles/kompute.dir/Sequence.cpp.o [ 65%] Building CXX object llmodel/llama.cpp-mainline/kompute/src/CMakeFiles/kompute.dir/Tensor.cpp.o [ 65%] Building CXX object llmodel/llama.cpp-mainline/kompute/src/CMakeFiles/kompute.dir/Core.cpp.o [ 66%] Building CXX object CMakeFiles/chat.dir/chat.cpp.o [ 66%] Building CXX object CMakeFiles/chat.dir/chatllm.cpp.o [ 67%] Building CXX object CMakeFiles/chat.dir/chatlistmodel.cpp.o [ 67%] Building CXX object CMakeFiles/chat.dir/chatapi.cpp.o [ 67%] Building CXX object CMakeFiles/chat.dir/database.cpp.o [ 68%] Linking CXX static library libkompute.a [ 68%] Built target kompute [ 68%] Built target ggml-mainline-default_autogen_timestamp_deps [ 68%] Built target ggml-mainline-avxonly_autogen_timestamp_deps [ 69%] Automatic MOC for target ggml-mainline-default [ 69%] Built target ggml-mainline-default_autogen [ 69%] Automatic MOC for target ggml-mainline-avxonly [ 69%] Built target ggml-mainline-avxonly_autogen [ 69%] Ensuring shaders are generated before compiling ggml-kompute.cpp [ 69%] Building CXX object llmodel/CMakeFiles/ggml-mainline-default.dir/ggml-mainline-default_autogen/mocs_compilation.cpp.o [ 70%] Building C object llmodel/CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/ggml.c.o /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml.c:18915:13: warning: ‘ggml_opt_get_grad’ defined but not used [-Wunused-function] 18915 | static void ggml_opt_get_grad(int np, struct ggml_tensor * const ps[], float * g) { | ^~~~~~~~~~~~~~~~~ [ 71%] Building CXX object CMakeFiles/chat.dir/embeddings.cpp.o [ 71%] Building CXX object CMakeFiles/chat.dir/download.cpp.o [ 71%] Building C object llmodel/CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/ggml-alloc.c.o /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml-alloc.c:448:13: warning: ‘ggml_gallocr_set_node_offset’ defined but not used 
[-Wunused-function] 448 | static void ggml_gallocr_set_node_offset(ggml_gallocr_t galloc, struct ggml_tensor * node, int buffer_id, size_t offset) { | ^~~~~~~~~~~~~~~~~~~~~~~~~~~~ [ 72%] Building C object llmodel/CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/ggml-backend.c.o /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml-backend.c:1078:13: warning: ‘ggml_backend_sched_print_assignments’ defined but not used [-Wunused-function] 1078 | static void ggml_backend_sched_print_assignments(ggml_backend_sched_t sched, struct ggml_cgraph * graph) { | ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ [ 72%] Building C object llmodel/CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/ggml-quants.c.o /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml-quants.c:1439:14: warning: ‘make_qkx1_quants’ defined but not used [-Wunused-function] 1439 | static float make_qkx1_quants(int n, int nmax, const float * restrict x, uint8_t * restrict L, float * restrict the_min, | ^~~~~~~~~~~~~~~~ [ 73%] Building CXX object CMakeFiles/chat.dir/embllm.cpp.o [ 73%] Building CXX object llmodel/CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/ggml-kompute.cpp.o [ 73%] Building CXX object CMakeFiles/chat.dir/localdocs.cpp.o [ 73%] Building CXX object CMakeFiles/chat.dir/localdocsmodel.cpp.o [ 74%] Building CXX object CMakeFiles/chat.dir/llm.cpp.o [ 74%] Building CXX object CMakeFiles/chat.dir/modellist.cpp.o [ 75%] Building CXX object CMakeFiles/chat.dir/mysettings.cpp.o [ 75%] Building CXX object CMakeFiles/chat.dir/network.cpp.o [ 76%] Building CXX object CMakeFiles/chat.dir/server.cpp.o [ 76%] Building CXX object CMakeFiles/chat.dir/logger.cpp.o [ 76%] Building CXX object CMakeFiles/chat.dir/responsetext.cpp.o [ 77%] Building CXX object CMakeFiles/chat.dir/chat_qmltyperegistrations.cpp.o [ 78%] Built target ggml-mainline-default [ 79%] Building CXX object llmodel/CMakeFiles/ggml-mainline-avxonly.dir/ggml-mainline-avxonly_autogen/mocs_compilation.cpp.o [ 79%] Building C object llmodel/CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/ggml.c.o /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml.c:18915:13: warning: ‘ggml_opt_get_grad’ defined but not used [-Wunused-function] 18915 | static void ggml_opt_get_grad(int np, struct ggml_tensor * const ps[], float * g) { | ^~~~~~~~~~~~~~~~~ [ 79%] Building C object llmodel/CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/ggml-alloc.c.o /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml-alloc.c:448:13: warning: ‘ggml_gallocr_set_node_offset’ defined but not used [-Wunused-function] 448 | static void ggml_gallocr_set_node_offset(ggml_gallocr_t galloc, struct ggml_tensor * node, int buffer_id, size_t offset) { | ^~~~~~~~~~~~~~~~~~~~~~~~~~~~ [ 80%] Building C object llmodel/CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/ggml-backend.c.o /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml-backend.c:1078:13: warning: ‘ggml_backend_sched_print_assignments’ defined but not used [-Wunused-function] 1078 | static void ggml_backend_sched_print_assignments(ggml_backend_sched_t sched, struct ggml_cgraph * graph) { | ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ [ 80%] Building CXX object CMakeFiles/chat.dir/build/.rcc/qrc_qmake_gpt4all.cpp.o [ 80%] Building C object llmodel/CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/ggml-quants.c.o [ 81%] Building CXX object CMakeFiles/chat.dir/build/.rcc/qrc_chat_raw_qml_0.cpp.o 
/home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/responsetext.cpp: In member function ‘virtual void SyntaxHighlighter::highlightBlock(const QString&)’:
/home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/responsetext.cpp:838:49: warning: ‘constexpr typename std::add_const<_Tp>::type& qAsConst(T&) [with T = QList; typename std::add_const<_Tp>::type = const QList]’ is deprecated: Use std::as_const() instead. [-Wdeprecated-declarations]
  838 | for (const HighlightingRule &rule : qAsConst(rules)) {
      | ~~~~~~~~^~~~~~~
In file included from /usr/include/qt6/QtCore/qforeach.h:11,
    from /usr/include/qt6/QtCore/qglobal.h:57,
    from /usr/include/qt6/QtCore/qnamespace.h:12,
    from /usr/include/qt6/QtCore/qobjectdefs.h:12,
    from /usr/include/qt6/QtCore/qobject.h:10,
    from /usr/include/qt6/QtCore/QObject:1,
    from /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/responsetext.h:4,
    from /home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/responsetext.cpp:1:
/usr/include/qt6/QtCore/qttypetraits.h:33:45: note: declared here
   33 | constexpr typename std::add_const::type &qAsConst(T &t) noexcept { return t; }
      | ^~~~~~~~
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml-quants.c:1439:14: warning: ‘make_qkx1_quants’ defined but not used [-Wunused-function]
 1439 | static float make_qkx1_quants(int n, int nmax, const float * restrict x, uint8_t * restrict L, float * restrict the_min,
      | ^~~~~~~~~~~~~~~~
[ 81%] Built target llama-mainline-default_autogen_timestamp_deps
[ 81%] Automatic MOC for target llama-mainline-default
[ 81%] Built target llama-mainline-default_autogen
[ 82%] Building CXX object llmodel/CMakeFiles/llama-mainline-default.dir/llama-mainline-default_autogen/mocs_compilation.cpp.o
[ 82%] Building CXX object llmodel/CMakeFiles/llama-mainline-default.dir/llama.cpp-mainline/llama.cpp.o
[ 83%] Building CXX object llmodel/CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/ggml-kompute.cpp.o
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.cpp: In function ‘ggml_tensor* llm_build_kqv(ggml_context*, const llama_model&, const llama_hparams&, const llama_kv_cache&, ggml_cgraph*, ggml_tensor*, ggml_tensor*, ggml_tensor*, ggml_tensor*, ggml_tensor*, int64_t, int32_t, int32_t, float, const llm_build_cb&, int)’:
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.cpp:5049:102: note: ‘#pragma message: TODO: ALiBi support in ggml_soft_max_ext is not implemented for Vulkan, and Kompute’
 5049 | #pragma message("TODO: ALiBi support in ggml_soft_max_ext is not implemented for Vulkan, and Kompute")
      | ^
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.cpp:5050:87: note: ‘#pragma message: Falling back to ggml_alibi(). Will become an error in Mar 2024’
 5050 | #pragma message(" Falling back to ggml_alibi().
Will become an error in Mar 2024") | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.cpp:5051:73: note: ‘#pragma message: ref: https://github.com/ggerganov/llama.cpp/pull/5488’ 5051 | #pragma message("ref: https://github.com/ggerganov/llama.cpp/pull/5488") | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.cpp:5056:24: warning: ‘ggml_tensor* ggml_alibi(ggml_context*, ggml_tensor*, int, int, float)’ is deprecated: use ggml_soft_max_ext instead (will be removed in Mar 2024) [-Wdeprecated-declarations] 5056 | kq = ggml_alibi(ctx, kq, /*n_past*/ 0, n_head, hparams.f_max_alibi_bias); | ~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ In file included from /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.h:4, from /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.cpp:2: /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml.h:1528:51: note: declared here 1528 | GGML_DEPRECATED(GGML_API struct ggml_tensor * ggml_alibi( | ^~~~~~~~~~ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml.h:202:41: note: in definition of macro ‘GGML_DEPRECATED’ 202 | # define GGML_DEPRECATED(func, hint) func __attribute__((deprecated(hint))) | ^~~~ [ 83%] Linking CXX executable bin/chat [ 84%] Built target ggml-mainline-avxonly [ 84%] Built target llama-mainline-avxonly_autogen_timestamp_deps [ 85%] Automatic MOC for target llama-mainline-avxonly [ 85%] Built target llama-mainline-avxonly_autogen [ 86%] Building CXX object llmodel/CMakeFiles/llama-mainline-avxonly.dir/llama-mainline-avxonly_autogen/mocs_compilation.cpp.o [ 86%] Building CXX object llmodel/CMakeFiles/llama-mainline-avxonly.dir/llama.cpp-mainline/llama.cpp.o /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.cpp: In function ‘ggml_tensor* llm_build_kqv(ggml_context*, const llama_model&, const llama_hparams&, const llama_kv_cache&, ggml_cgraph*, ggml_tensor*, ggml_tensor*, ggml_tensor*, ggml_tensor*, ggml_tensor*, int64_t, int32_t, int32_t, float, const llm_build_cb&, int)’: /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.cpp:5049:102: note: ‘#pragma message: TODO: ALiBi support in ggml_soft_max_ext is not implemented for Vulkan, and Kompute’ 5049 | #pragma message("TODO: ALiBi support in ggml_soft_max_ext is not implemented for Vulkan, and Kompute") | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.cpp:5050:87: note: ‘#pragma message: Falling back to ggml_alibi(). Will become an error in Mar 2024’ 5050 | #pragma message(" Falling back to ggml_alibi(). 
Will become an error in Mar 2024") | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.cpp:5051:73: note: ‘#pragma message: ref: https://github.com/ggerganov/llama.cpp/pull/5488’ 5051 | #pragma message("ref: https://github.com/ggerganov/llama.cpp/pull/5488") | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.cpp:5056:24: warning: ‘ggml_tensor* ggml_alibi(ggml_context*, ggml_tensor*, int, int, float)’ is deprecated: use ggml_soft_max_ext instead (will be removed in Mar 2024) [-Wdeprecated-declarations] 5056 | kq = ggml_alibi(ctx, kq, /*n_past*/ 0, n_head, hparams.f_max_alibi_bias); | ~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ In file included from /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.h:4, from /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/llama.cpp:2: /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml.h:1528:51: note: declared here 1528 | GGML_DEPRECATED(GGML_API struct ggml_tensor * ggml_alibi( | ^~~~~~~~~~ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/ggml.h:202:41: note: in definition of macro ‘GGML_DEPRECATED’ 202 | # define GGML_DEPRECATED(func, hint) func __attribute__((deprecated(hint))) | ^~~~ [ 87%] Built target chat [ 88%] Linking CXX static library libllama-mainline-default.a /usr/sbin/ar qc libllama-mainline-default.a "CMakeFiles/llama-mainline-default.dir/llama-mainline-default_autogen/mocs_compilation.cpp.o" "CMakeFiles/llama-mainline-default.dir/llama.cpp-mainline/llama.cpp.o" "CMakeFiles/ggml-mainline-default.dir/ggml-mainline-default_autogen/mocs_compilation.cpp.o" "CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/ggml.c.o" "CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/ggml-alloc.c.o" "CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/ggml-backend.c.o" "CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/ggml-quants.c.o" "CMakeFiles/ggml-mainline-default.dir/llama.cpp-mainline/ggml-kompute.cpp.o" /usr/sbin/ranlib libllama-mainline-default.a [ 88%] Built target llama-mainline-default [ 88%] Built target llamamodel-mainline-default_autogen_timestamp_deps [ 88%] Built target gptj-default_autogen_timestamp_deps [ 88%] Automatic MOC for target llamamodel-mainline-default [ 89%] Automatic MOC for target gptj-default [ 89%] Built target llamamodel-mainline-default_autogen [ 89%] Built target gptj-default_autogen [ 89%] Building CXX object llmodel/CMakeFiles/llamamodel-mainline-default.dir/llamamodel-mainline-default_autogen/mocs_compilation.cpp.o [ 90%] Building CXX object llmodel/CMakeFiles/llamamodel-mainline-default.dir/llamamodel.cpp.o [ 90%] Building CXX object llmodel/CMakeFiles/llamamodel-mainline-default.dir/llmodel_shared.cpp.o [ 91%] Building CXX object llmodel/CMakeFiles/gptj-default.dir/gptj-default_autogen/mocs_compilation.cpp.o [ 91%] Building CXX object llmodel/CMakeFiles/gptj-default.dir/gptj.cpp.o [ 91%] Building CXX object llmodel/CMakeFiles/gptj-default.dir/utils.cpp.o [ 92%] Building CXX object llmodel/CMakeFiles/gptj-default.dir/llmodel_shared.cpp.o [ 93%] Linking CXX shared library ../bin/libllamamodel-mainline-default.so /usr/sbin/c++ -fPIC -march=x86-64 -mtune=generic -O2 -pipe -fno-plt -fexceptions -Wp,-D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security -fstack-clash-protection -fcf-protection -Wp,-D_GLIBCXX_ASSERTIONS -flto=auto -O3 -DNDEBUG -Wl,-O1,--sort-common,--as-needed,-z,relro,-z,now -flto=auto -shared 
-Wl,-soname,libllamamodel-mainline-default.so -o ../bin/libllamamodel-mainline-default.so "CMakeFiles/llamamodel-mainline-default.dir/llamamodel-mainline-default_autogen/mocs_compilation.cpp.o" "CMakeFiles/llamamodel-mainline-default.dir/llamamodel.cpp.o" "CMakeFiles/llamamodel-mainline-default.dir/llmodel_shared.cpp.o" -Wl,-rpath,/home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin: libllama-mainline-default.a llama.cpp-mainline/kompute/src/libkompute.a llama.cpp-mainline/kompute/src/logger/libkp_logger.a ../bin/libfmt.so.10.0.0 /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:19:7: warning: type ‘struct Manager’ violates the C++ One Definition Rule [-Wodr] 19 | class Manager | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:19:7: note: a different type is defined in another translation unit 19 | class Manager | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:273:32: note: the first difference of corresponding definitions is field ‘mDebugReportCallback’ 273 | vk::DebugReportCallbackEXT mDebugReportCallback; | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:19:7: note: a type with different number of fields is defined in another translation unit 19 | class Manager | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:238:37: warning: type of ‘listDevices’ does not match original declaration [-Wlto-type-mismatch] 238 | std::vector listDevices() const; | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:500:1: note: ‘listDevices’ was previously declared here 500 | Manager::listDevices() const | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:500:1: note: code may be misoptimized unless ‘-fno-strict-aliasing’ is used /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:54:10: warning: type of ‘initializeDevice’ does not match original declaration [-Wlto-type-mismatch] 54 | void initializeDevice(uint32_t physicalDeviceIndex, | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:46:6: note: ‘initializeDevice’ was previously declared here 46 | void Manager::initializeDevice(uint32_t physicalDeviceIndex, | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:46:6: note: code may be misoptimized unless ‘-fno-strict-aliasing’ is used /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:67:31: warning: type of ‘sequence’ does not match original declaration [-Wlto-type-mismatch] 67 | std::shared_ptr sequence(uint32_t queueIndex = 0, | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:475:1: note: ‘sequence’ was previously declared here 475 | Manager::sequence(uint32_t queueIndex, uint32_t totalTimestamps) | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:475:1: note: code may be misoptimized unless ‘-fno-strict-aliasing’ is used /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:25:5: 
warning: type of ‘__ct_comp ’ does not match original declaration [-Wlto-type-mismatch] 25 | Manager(); | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:35:1: note: ‘__ct_comp ’ was previously declared here 35 | Manager::Manager() | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:35:1: note: code may be misoptimized unless ‘-fno-strict-aliasing’ is used /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:31:5: warning: type of ‘__dt_comp ’ does not match original declaration [-Wlto-type-mismatch] 31 | ~Manager(); | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:54:1: note: ‘__dt_comp ’ was previously declared here 54 | Manager::~Manager() | ^ /home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:54:1: note: code may be misoptimized unless ‘-fno-strict-aliasing’ is used [ 93%] Linking CXX shared library ../bin/libgptj-default.so /usr/sbin/c++ -fPIC -march=x86-64 -mtune=generic -O2 -pipe -fno-plt -fexceptions -Wp,-D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security -fstack-clash-protection -fcf-protection -Wp,-D_GLIBCXX_ASSERTIONS -flto=auto -O3 -DNDEBUG -Wl,-O1,--sort-common,--as-needed,-z,relro,-z,now -flto=auto -shared -Wl,-soname,libgptj-default.so -o ../bin/libgptj-default.so "CMakeFiles/gptj-default.dir/gptj-default_autogen/mocs_compilation.cpp.o" "CMakeFiles/gptj-default.dir/gptj.cpp.o" "CMakeFiles/gptj-default.dir/utils.cpp.o" "CMakeFiles/gptj-default.dir/llmodel_shared.cpp.o" -Wl,-rpath,/home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin: libllama-mainline-default.a llama.cpp-mainline/kompute/src/libkompute.a llama.cpp-mainline/kompute/src/logger/libkp_logger.a ../bin/libfmt.so.10.0.0 [ 93%] Linking CXX static library libllama-mainline-avxonly.a /usr/sbin/ar qc libllama-mainline-avxonly.a "CMakeFiles/llama-mainline-avxonly.dir/llama-mainline-avxonly_autogen/mocs_compilation.cpp.o" "CMakeFiles/llama-mainline-avxonly.dir/llama.cpp-mainline/llama.cpp.o" "CMakeFiles/ggml-mainline-avxonly.dir/ggml-mainline-avxonly_autogen/mocs_compilation.cpp.o" "CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/ggml.c.o" "CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/ggml-alloc.c.o" "CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/ggml-backend.c.o" "CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/ggml-quants.c.o" "CMakeFiles/ggml-mainline-avxonly.dir/llama.cpp-mainline/ggml-kompute.cpp.o" /usr/sbin/ranlib libllama-mainline-avxonly.a [ 93%] Built target llama-mainline-avxonly [ 93%] Built target llamamodel-mainline-avxonly_autogen_timestamp_deps [ 93%] Built target gptj-avxonly_autogen_timestamp_deps [ 93%] Automatic MOC for target llamamodel-mainline-avxonly [ 93%] Built target llamamodel-mainline-avxonly_autogen [ 94%] Automatic MOC for target gptj-avxonly [ 94%] Built target gptj-avxonly_autogen [ 95%] Building CXX object llmodel/CMakeFiles/llamamodel-mainline-avxonly.dir/llamamodel-mainline-avxonly_autogen/mocs_compilation.cpp.o [ 95%] Building CXX object llmodel/CMakeFiles/llamamodel-mainline-avxonly.dir/llamamodel.cpp.o [ 96%] Building CXX object llmodel/CMakeFiles/llamamodel-mainline-avxonly.dir/llmodel_shared.cpp.o [ 96%] Built target gptj-default [ 97%] Building CXX object llmodel/CMakeFiles/gptj-avxonly.dir/gptj-avxonly_autogen/mocs_compilation.cpp.o [ 97%] Building CXX object 
[ 97%] Building CXX object llmodel/CMakeFiles/gptj-avxonly.dir/gptj.cpp.o
[ 97%] Building CXX object llmodel/CMakeFiles/gptj-avxonly.dir/utils.cpp.o
[ 97%] Linking CXX shared library ../bin/libllamamodel-mainline-avxonly.so
/usr/sbin/c++ -fPIC -march=x86-64 -mtune=generic -O2 -pipe -fno-plt -fexceptions -Wp,-D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security -fstack-clash-protection -fcf-protection -Wp,-D_GLIBCXX_ASSERTIONS -flto=auto -O3 -DNDEBUG -Wl,-O1,--sort-common,--as-needed,-z,relro,-z,now -flto=auto -shared -Wl,-soname,libllamamodel-mainline-avxonly.so -o ../bin/libllamamodel-mainline-avxonly.so "CMakeFiles/llamamodel-mainline-avxonly.dir/llamamodel-mainline-avxonly_autogen/mocs_compilation.cpp.o" "CMakeFiles/llamamodel-mainline-avxonly.dir/llamamodel.cpp.o" "CMakeFiles/llamamodel-mainline-avxonly.dir/llmodel_shared.cpp.o" -Wl,-rpath,/home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin: libllama-mainline-avxonly.a llama.cpp-mainline/kompute/src/libkompute.a llama.cpp-mainline/kompute/src/logger/libkp_logger.a ../bin/libfmt.so.10.0.0
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:19:7: warning: type ‘struct Manager’ violates the C++ One Definition Rule [-Wodr]
   19 | class Manager
      |       ^
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:19:7: note: a different type is defined in another translation unit
   19 | class Manager
      |       ^
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:273:32: note: the first difference of corresponding definitions is field ‘mDebugReportCallback’
  273 | vk::DebugReportCallbackEXT mDebugReportCallback;
      | ^
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:19:7: note: a type with different number of fields is defined in another translation unit
   19 | class Manager
      |       ^
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:238:37: warning: type of ‘listDevices’ does not match original declaration [-Wlto-type-mismatch]
  238 | std::vector listDevices() const;
      | ^
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:500:1: note: ‘listDevices’ was previously declared here
  500 | Manager::listDevices() const
      | ^
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:500:1: note: code may be misoptimized unless ‘-fno-strict-aliasing’ is used
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:54:10: warning: type of ‘initializeDevice’ does not match original declaration [-Wlto-type-mismatch]
   54 | void initializeDevice(uint32_t physicalDeviceIndex,
      | ^
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:46:6: note: ‘initializeDevice’ was previously declared here
   46 | void Manager::initializeDevice(uint32_t physicalDeviceIndex,
      | ^
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:46:6: note: code may be misoptimized unless ‘-fno-strict-aliasing’ is used
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:67:31: warning: type of ‘sequence’ does not match original declaration [-Wlto-type-mismatch]
   67 | std::shared_ptr sequence(uint32_t queueIndex = 0,
      | ^
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:475:1: note: ‘sequence’ was previously declared here
  475 | Manager::sequence(uint32_t queueIndex, uint32_t totalTimestamps)
      | ^
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:475:1: note: code may be misoptimized unless ‘-fno-strict-aliasing’ is used
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:25:5: warning: type of ‘__ct_comp ’ does not match original declaration [-Wlto-type-mismatch]
   25 | Manager();
      | ^
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:35:1: note: ‘__ct_comp ’ was previously declared here
   35 | Manager::Manager()
      | ^
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:35:1: note: code may be misoptimized unless ‘-fno-strict-aliasing’ is used
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/include/kompute/Manager.hpp:31:5: warning: type of ‘__dt_comp ’ does not match original declaration [-Wlto-type-mismatch]
   31 | ~Manager();
      | ^
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:54:1: note: ‘__dt_comp ’ was previously declared here
   54 | Manager::~Manager()
      | ^
/home/main-builder/pkgwork/src/gpt4all/gpt4all-backend/llama.cpp-mainline/kompute/src/Manager.cpp:54:1: note: code may be misoptimized unless ‘-fno-strict-aliasing’ is used
[ 98%] Building CXX object llmodel/CMakeFiles/gptj-avxonly.dir/llmodel_shared.cpp.o
[ 98%] Linking CXX shared library ../bin/libgptj-avxonly.so
/usr/sbin/c++ -fPIC -march=x86-64 -mtune=generic -O2 -pipe -fno-plt -fexceptions -Wp,-D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security -fstack-clash-protection -fcf-protection -Wp,-D_GLIBCXX_ASSERTIONS -flto=auto -O3 -DNDEBUG -Wl,-O1,--sort-common,--as-needed,-z,relro,-z,now -flto=auto -shared -Wl,-soname,libgptj-avxonly.so -o ../bin/libgptj-avxonly.so "CMakeFiles/gptj-avxonly.dir/gptj-avxonly_autogen/mocs_compilation.cpp.o" "CMakeFiles/gptj-avxonly.dir/gptj.cpp.o" "CMakeFiles/gptj-avxonly.dir/utils.cpp.o" "CMakeFiles/gptj-avxonly.dir/llmodel_shared.cpp.o" -Wl,-rpath,/home/main-builder/pkgwork/src/gpt4all/gpt4all-chat/build/bin: libllama-mainline-avxonly.a llama.cpp-mainline/kompute/src/libkompute.a llama.cpp-mainline/kompute/src/logger/libkp_logger.a ../bin/libfmt.so.10.0.0
[100%] Built target llamamodel-mainline-default
[100%] Built target gptj-avxonly
[100%] Built target llamamodel-mainline-avxonly
==> Entering fakeroot environment...
==> Starting package()...
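Every -Wlto-type-mismatch note above ends with the same hint: "code may be misoptimized unless ‘-fno-strict-aliasing’ is used". In PKGBUILD terms there are two blunt ways a packager could react; the fragment below is only a sketch of those knobs (the cmake invocation and the source path are assumptions based on the paths in this log, not the actual gpt4all-chat-git recipe):

# Option 1: have makepkg drop the -flto=auto flag seen in the compile/link
# lines above, which removes the LTO-time type checks entirely.
options=(!lto)

# Option 2: keep LTO but follow the compiler's suggestion.
build() {
  cd "$srcdir/gpt4all/gpt4all-chat"
  CXXFLAGS+=" -fno-strict-aliasing"          # appended to the flags makepkg already exports
  cmake -B build -DCMAKE_BUILD_TYPE=Release
  cmake --build build
}

Either choice trades some optimization for predictability; as the log shows, the warnings on their own do not stop the build.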
[ 23%] Built target chat_tooling
[ 23%] Built target llmodel_autogen_timestamp_deps
[ 24%] Built target kp_shader
[ 25%] Built target chat_qmlimportscan
[ 25%] Built target fmt_autogen_timestamp_deps
[ 25%] Built target xxd_autogen_timestamp_deps
[ 26%] Built target llmodel_autogen
[ 26%] Built target fmt_autogen
[ 27%] Built target xxd_autogen
[ 29%] Built target llmodel
[ 31%] Built target fmt
[ 32%] Built target xxd
[ 32%] Built target chat_autogen_timestamp_deps
[ 32%] Built target kp_logger_autogen_timestamp_deps
[ 56%] Built target generated_shaders
[ 56%] Built target chat_autogen
[ 56%] Built target kp_logger_autogen
[ 58%] Built target kp_logger
[ 58%] Built target kompute_autogen_timestamp_deps
[ 70%] Built target chat
[ 70%] Built target kompute_autogen
[ 76%] Built target kompute
[ 76%] Built target ggml-mainline-default_autogen_timestamp_deps
[ 76%] Built target ggml-mainline-avxonly_autogen_timestamp_deps
[ 77%] Built target ggml-mainline-default_autogen
[ 77%] Built target ggml-mainline-avxonly_autogen
[ 80%] Built target ggml-mainline-default
[ 84%] Built target ggml-mainline-avxonly
[ 84%] Built target llama-mainline-default_autogen_timestamp_deps
[ 84%] Built target llama-mainline-avxonly_autogen_timestamp_deps
[ 84%] Built target llama-mainline-default_autogen
[ 85%] Built target llama-mainline-avxonly_autogen
[ 87%] Built target llama-mainline-default
[ 87%] Built target llamamodel-mainline-default_autogen_timestamp_deps
[ 88%] Built target llama-mainline-avxonly
[ 88%] Built target gptj-default_autogen_timestamp_deps
[ 88%] Built target llamamodel-mainline-default_autogen
[ 88%] Built target llamamodel-mainline-avxonly_autogen_timestamp_deps
[ 88%] Built target gptj-avxonly_autogen_timestamp_deps
[ 89%] Built target gptj-default_autogen
[ 92%] Built target llamamodel-mainline-default
[ 92%] Built target llamamodel-mainline-avxonly_autogen
[ 93%] Built target gptj-avxonly_autogen
[ 95%] Built target gptj-default
[ 97%] Built target llamamodel-mainline-avxonly
[100%] Built target gptj-avxonly
Install the project...
-- Install configuration: "Release"
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_xlib_xrandr.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_xlib.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_xcb.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_win32.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_wayland.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_vi.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_to_string.hpp
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_structs.hpp
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_static_assertions.hpp
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_screen.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_raii.hpp
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_metal.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_macos.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_ios.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_hash.hpp
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_handles.hpp
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_ggp.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_funcs.hpp
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_fuchsia.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_format_traits.hpp
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_enums.hpp
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_directfb.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_core.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_beta.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan_android.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan.hpp
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vulkan.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vk_sdk_platform.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vk_platform.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vk_layer.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vulkan/vk_icd.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vk_video
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vk_video/vulkan_video_codecs_common.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vk_video/vulkan_video_codec_h265std_encode.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vk_video/vulkan_video_codec_h265std_decode.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vk_video/vulkan_video_codec_h265std.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vk_video/vulkan_video_codec_h264std_encode.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vk_video/vulkan_video_codec_h264std_decode.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/vk_video/vulkan_video_codec_h264std.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/share/vulkan/registry
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/share/vulkan/registry/vkconventions.py
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/share/vulkan/registry/vk.xml
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/share/vulkan/registry/video.xml
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/share/vulkan/registry/validusage.json
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/share/vulkan/registry/spec_tools
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/share/vulkan/registry/spec_tools/util.py
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/share/vulkan/registry/spec_tools/conventions.py
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/share/vulkan/registry/reg.py
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/share/vulkan/registry/genvk.py
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/share/vulkan/registry/generator.py
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/share/vulkan/registry/cgenerator.py
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/share/vulkan/registry/apiconventions.py
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/libfmt.so.10.0.0
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/libfmt.so.10
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/libfmt.so
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/fmt/args.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/fmt/chrono.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/fmt/color.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/fmt/compile.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/fmt/core.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/fmt/format.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/fmt/format-inl.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/fmt/os.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/fmt/ostream.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/fmt/printf.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/fmt/ranges.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/fmt/std.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/fmt/xchar.h
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/cmake/fmt/fmt-config.cmake
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/cmake/fmt/fmt-config-version.cmake
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/cmake/fmt/fmt-targets.cmake
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/cmake/fmt/fmt-targets-release.cmake
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/pkgconfig/fmt.pc
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/libkompute.a
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/ShaderOpMult.hpp
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/include/ShaderLogisticRegression.hpp
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/bin/chat
-- Set non-toolchain portion of runtime path of "/home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/bin/chat" to ""
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/libllmodel.so.0.5.0
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/libllmodel.so.0
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/libllmodel.so
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/libgptj-avxonly.so
-- Set non-toolchain portion of runtime path of "/home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/libgptj-avxonly.so" to ""
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/libgptj-default.so
-- Set non-toolchain portion of runtime path of "/home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/libgptj-default.so" to ""
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/libllama-mainline-avxonly.a
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/libllama-mainline-default.a
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/libllamamodel-mainline-avxonly.so
-- Set non-toolchain portion of runtime path of "/home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/libllamamodel-mainline-avxonly.so" to ""
-- Installing: /home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/libllamamodel-mainline-default.so
-- Set non-toolchain portion of runtime path of "/home/main-builder/pkgwork/pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/libllamamodel-mainline-default.so" to ""
==> Tidying install...
-> Removing libtool files...
-> Purging unwanted files...
-> Removing static library files...
-> Stripping unneeded symbols from binaries and libraries...
strip: ./opt/gpt4all-chat/lib/sttMXX94/mocs_compilation.cpp.o: plugin needed to handle lto object
strip: ./opt/gpt4all-chat/lib/st9VqFNQ/mocs_compilation.cpp.o: plugin needed to handle lto object
strip: ./opt/gpt4all-chat/lib/staxWu5J/mocs_compilation.cpp.o: plugin needed to handle lto object
strip: ./opt/gpt4all-chat/lib/stm5JcLD/mocs_compilation.cpp.o: plugin needed to handle lto object
strip: ./opt/gpt4all-chat/lib/stm17kSb/mocs_compilation.cpp.o: plugin needed to handle lto object
strip: ./opt/gpt4all-chat/lib/st7jtVQg/mocs_compilation.cpp.o: plugin needed to handle lto object
-> Compressing man and info pages...
==> Checking for packaging issues...
==> WARNING: Package contains reference to $srcdir
opt/gpt4all-chat/lib/libllamamodel-mainline-default.so
opt/gpt4all-chat/lib/libllamamodel-mainline-avxonly.so
opt/gpt4all-chat/lib/libgptj-default.so
opt/gpt4all-chat/lib/libgptj-avxonly.so
==> Creating package "gpt4all-chat-git"...
-> Generating .PKGINFO file...
-> Generating .BUILDINFO file...
-> Generating .MTREE file...
-> Compressing package...
==> Leaving fakeroot environment.
==> Finished making: gpt4all-chat-git r1785.855fd224-1 (Thu 02 May 2024 11:38:38 AM -03)

real    4m37.105s
user    16m32.623s
sys     1m25.351s
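Two of the packaging messages above deserve a closer look. The strip lines mean the temporary mocs_compilation.cpp.o copies are LTO bytecode, which strip cannot process without the linker plugin, and the $srcdir warning means the four backend libraries still contain the /home/main-builder/pkgwork/src build path somewhere. The commands below are an illustrative way to find out where that reference lives; the package-tree paths mirror the ones printed above and patchelf is an assumed extra tool, so treat this as a sketch rather than part of the original build:

for lib in libllamamodel-mainline-default.so libllamamodel-mainline-avxonly.so libgptj-default.so libgptj-avxonly.so; do
    f="pkg/gpt4all-chat-git/opt/gpt4all-chat/lib/$lib"
    echo "== $lib =="
    readelf -d "$f" | grep -Ei 'rpath|runpath'     # leftover build-tree RPATH/RUNPATH?
    strings -a "$f" | grep -m1 'pkgwork/src'       # or a path baked in elsewhere in the binary?
done

# If it turns out to be nothing more than an RPATH/RUNPATH entry, one common
# cleanup at the end of package() would be:
#   for f in "$pkgdir"/opt/gpt4all-chat/lib/*.so; do patchelf --remove-rpath "$f"; done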