422 Commits

Author SHA1 Message Date
2676515b4e Merge pull request 'Use spawn_blocking for ingest scan' (#286) from 283-web-server-stops-responding-during-ingest into main
All checks were successful
Reviewed-on: #286
2025-12-01 22:36:55 +00:00
c498e1a7fa Use spawn_blocking for ingest scan
All checks were successful
2025-12-01 17:30:01 -05:00
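The fix for #283 moves the ingest scan off the async executor so the web server keeps responding during a scan. A minimal std-only sketch of the same pattern (the project presumably uses `tokio::task::spawn_blocking`; all names below are illustrative):

```rust
use std::sync::mpsc;
use std::thread;

/// Stand-in for the blocking ingest scan: a CPU/IO-heavy loop that
/// must not run directly on the async executor's worker threads.
fn blocking_scan(entries: &[&str]) -> usize {
    entries.iter().filter(|e| e.ends_with(".mp3")).count()
}

/// Offload the scan to a dedicated thread and receive the result over a
/// channel -- the same shape as `tokio::task::spawn_blocking(|| ...).await`.
fn scan_offloaded(entries: Vec<String>) -> usize {
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        let refs: Vec<&str> = entries.iter().map(String::as_str).collect();
        let n = blocking_scan(&refs);
        tx.send(n).expect("receiver alive");
    });
    // The caller (the "async" side) stays free to serve requests
    // until it actually awaits the result.
    rx.recv().expect("scan thread completed")
}

fn main() {
    let entries = vec!["a.mp3".to_string(), "b.flac".to_string(), "c.mp3".to_string()];
    println!("found {} audio files", scan_offloaded(entries));
}
```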
93a41cfcbb Merge pull request '236-song-and-songlist-refactor' (#285) from 236-song-and-songlist-refactor into main
All checks were successful
Reviewed-on: #285
2025-12-01 22:13:16 +00:00
2e42d0e964 Remove old song list component
All checks were successful
2025-12-01 16:53:46 -05:00
aa3132feb9 Change playlist page to use new song list 2025-12-01 16:51:27 -05:00
e2a11dc785 Change search page to use new song list 2025-12-01 16:51:12 -05:00
a66ea13bfa Change liked songs page to use new song list 2025-12-01 16:50:39 -05:00
a235728d67 Add remove_search_score function
Add SearchResults and remove_search_score to prelude
2025-12-01 16:49:42 -05:00
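The log does not show `remove_search_score`'s signature, but its presumable role is dropping relevance scores from ranked results once ranking is done. A hypothetical sketch (types and signature are assumptions):

```rust
/// Hypothetical stand-in for a search relevance score.
type Score = u32;

/// Strip the scores from ranked search hits, keeping only the items --
/// an assumed shape for `remove_search_score`, not the crate's real code.
fn remove_search_score<T>(results: Vec<(Score, T)>) -> Vec<T> {
    results.into_iter().map(|(_, item)| item).collect()
}

fn main() {
    let ranked = vec![(98, "Song A"), (73, "Song B")];
    println!("{:?}", remove_search_score(ranked));
}
```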
68526cc63b Change profile page recent songs to use new song list 2025-12-01 16:20:41 -05:00
da1a7c6d8b Change profile page top songs to use new song list 2025-12-01 16:14:59 -05:00
da7491584e Change artist page to use new song list 2025-12-01 16:09:14 -05:00
25669570c5 Add PlaysSongList 2025-12-01 16:09:14 -05:00
76ecaae81a Add SongPlays component 2025-12-01 16:09:10 -05:00
c90f318be1 Change album page to use new song list 2025-12-01 15:28:21 -05:00
9c90a674da Add ArtistList component 2025-12-01 15:17:46 -05:00
7500375f0c Implement new song list 2025-12-01 15:10:11 -05:00
db18298832 Set playbar noderef from GlobalState
Some checks failed
2025-12-01 12:34:49 -05:00
5acdaf9d55 Add playbar_element to GlobalState 2025-12-01 12:34:31 -05:00
28effd4024 Fix styling of personal component to avoid resizing issues 2025-11-22 14:37:49 -05:00
636c811e24 Add get_songs_by_id API function 2025-11-22 14:35:58 -05:00
c5654fc9f7 clippy 2025-11-22 14:35:28 -05:00
dd13cdd2cc fmt 2025-11-22 14:31:18 -05:00
ff8f2ecfca Upgrade to Rust edition 2024 2025-11-22 14:30:50 -05:00
067ce69c4b Create structure for songs and song_list 2025-11-21 10:45:18 -05:00
b47f43f6e0 Merge pull request 'Use subqueries instead of two separate queries' (#280) from 279-use-subqueries-instead-of-multiple-queries into main
All checks were successful
Reviewed-on: #280
2025-11-03 22:57:19 +00:00
7574c2a690 Use subqueries instead of two separate queries
All checks were successful
2025-11-03 12:21:43 -05:00
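The change in #279 collapses two database round trips into one query with a subquery, e.g. `WHERE id IN (SELECT song_id FROM likes WHERE user_id = $1)`. An in-memory sketch of the before/after shapes (table and field names are illustrative; each function body stands in for one round trip per pass):

```rust
#[derive(Clone, Debug, PartialEq)]
struct Song { id: u32, title: &'static str }
struct Like { user_id: u32, song_id: u32 }

/// Old shape: one query for the ids, then a second query for the rows.
fn liked_songs_two_queries(songs: &[Song], likes: &[Like], user: u32) -> Vec<Song> {
    // Round trip 1: SELECT song_id FROM likes WHERE user_id = $1
    let ids: Vec<u32> = likes.iter()
        .filter(|l| l.user_id == user)
        .map(|l| l.song_id)
        .collect();
    // Round trip 2: SELECT * FROM songs WHERE id = ANY($ids)
    songs.iter().filter(|s| ids.contains(&s.id)).cloned().collect()
}

/// New shape: a single query with the id lookup pushed into a subquery.
fn liked_songs_subquery(songs: &[Song], likes: &[Like], user: u32) -> Vec<Song> {
    songs.iter()
        .filter(|s| likes.iter().any(|l| l.user_id == user && l.song_id == s.id))
        .cloned()
        .collect()
}

fn main() {
    let songs = vec![Song { id: 1, title: "a" }, Song { id: 2, title: "b" }];
    let likes = vec![Like { user_id: 7, song_id: 2 }];
    println!("{:?}", liked_songs_subquery(&songs, &likes, 7));
}
```

Both shapes return the same rows; the win is fewer round trips and letting the database plan the whole lookup.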
c2378c1304 Merge pull request 'Move ingest into main project' (#277) from 225-move-ingest-into-main-project into main
Some checks failed
Reviewed-on: #277
2025-10-27 17:17:58 +00:00
689dfefec0 Implement ingest scanning
All checks were successful
2025-10-27 12:40:27 -04:00
b6fc5cab73 Create functions for creating media in database 2025-10-27 12:24:18 -04:00
da4ffd967f Create functions for finding media in database 2025-10-27 12:21:57 -04:00
02ce795254 Add function to convert file path to LocalPath 2025-10-27 12:12:26 -04:00
19ed144283 Add config option to disable ingest 2025-10-27 12:09:44 -04:00
77912507b5 Use prelude in ingest 2025-10-27 12:03:43 -04:00
324acec659 Merge remote-tracking branch 'origin/main' into 225-move-ingest-into-main-project 2025-10-27 11:54:28 -04:00
6b1bfbc125 Merge pull request 'Fix song search limit' (#275) from 241-fix-song-search-limit into main
All checks were successful
Reviewed-on: #275
2025-10-22 00:48:45 +00:00
486034f63d Use limit in separate song_ids query
All checks were successful
2025-10-21 17:33:48 -04:00
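The search-limit bug (#241) is the classic join/limit interaction: applying `LIMIT` to rows joined with artists lets a multi-artist song consume several slots, so fewer distinct songs come back than requested. The fix limits a separate query over song ids first. An in-memory illustration (data and names are made up):

```rust
/// (song_id, artist) pairs as a joined query would return them:
/// a song with two artists produces two rows.
fn joined_rows() -> Vec<(u32, &'static str)> {
    vec![(1, "A"), (1, "B"), (2, "C"), (3, "D")]
}

/// Buggy shape: LIMIT applied to the joined rows. Song 1's two artist
/// rows eat both slots, so only one distinct song survives.
fn limited_on_join(limit: usize) -> Vec<u32> {
    let mut ids: Vec<u32> = joined_rows().into_iter().take(limit).map(|(id, _)| id).collect();
    ids.dedup();
    ids
}

/// Fixed shape: apply the limit to a separate query over distinct song
/// ids, then join artists onto exactly those songs.
fn limited_on_ids(limit: usize) -> Vec<u32> {
    let mut ids: Vec<u32> = joined_rows().into_iter().map(|(id, _)| id).collect();
    ids.dedup();
    ids.into_iter().take(limit).collect()
}

fn main() {
    println!("buggy: {:?}, fixed: {:?}", limited_on_join(2), limited_on_ids(2));
}
```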
424349252e Merge pull request 'Fix album search' (#274) from 240-fix-album-search into main
All checks were successful
Reviewed-on: #274
2025-10-21 21:31:41 +00:00
c49ff75983 Merge pull request 'Fix playlists still not available after login' (#273) from 260-fix-playlists-still-not-available-after into main
Some checks failed
Reviewed-on: #273
2025-10-21 21:31:17 +00:00
1956fb0c30 Use left_join with artists in album search
All checks were successful
2025-10-21 17:18:59 -04:00
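Switching the album search to a `left_join` with artists (#240) matters because an inner join silently drops albums that have no matching artist row. A minimal in-memory illustration of the difference (schema and names are illustrative):

```rust
#[derive(Clone)]
struct Album { title: &'static str, artist_id: Option<u32> }

fn albums() -> Vec<Album> {
    vec![
        Album { title: "With Artist", artist_id: Some(10) },
        Album { title: "Compilation", artist_id: None },
    ]
}

/// Inner-join shape: albums without an artist row drop out entirely,
/// so "Compilation" can never be found -- the search bug.
fn search_inner_join(q: &str) -> Vec<&'static str> {
    albums().into_iter()
        .filter(|a| a.artist_id.is_some()) // INNER JOIN artists ...
        .filter(|a| a.title.contains(q))
        .map(|a| a.title)
        .collect()
}

/// Left-join shape: albums whose artist side is NULL are kept,
/// so artist-less albums still match on title.
fn search_left_join(q: &str) -> Vec<&'static str> {
    albums().into_iter() // LEFT JOIN artists ...
        .filter(|a| a.title.contains(q))
        .map(|a| a.title)
        .collect()
}

fn main() {
    println!("{:?}", search_left_join("Compilation"));
}
```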
600a8897a4 Fix Resource get/read warning
All checks were successful
2025-10-21 17:16:49 -04:00
abb2a129d0 Return user id from resource tracking callback
All checks were successful
2025-10-21 16:38:37 -04:00
c3316933a2 Merge pull request 'Add Diesel lock file to gitignore' (#272) from 271-add-diesel-lock-file-to-gitignore into main
Some checks failed
Reviewed-on: #272
2025-10-21 20:33:37 +00:00
b4158a5cc3 Add Diesel lock file to gitignore
All checks were successful
2025-10-21 16:03:52 -04:00
5534ec1eb3 Merge pull request 'Import cleanup and prelude module' (#269) from 268-import-cleanup-and-prelude-module into main
Some checks failed
Reviewed-on: #269
2025-10-21 02:41:27 +00:00
0cb32783bf Import cleanup
All checks were successful
2025-10-20 22:32:51 -04:00
cc297be997 Add prelude module 2025-10-20 22:32:23 -04:00
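The prelude pattern added here re-exports a crate's most-used items so call sites need only `use crate::prelude::*;` instead of long import lists. A self-contained sketch of the pattern (module and item names are illustrative, not the crate's real ones):

```rust
mod models {
    #[derive(Debug, PartialEq)]
    pub struct Song { pub title: String }
}

mod api {
    // A stand-in API function that call sites want in scope.
    pub fn song_count(songs: &[super::models::Song]) -> usize {
        songs.len()
    }
}

/// The prelude: one module that re-exports the common surface.
mod prelude {
    pub use super::api::song_count;
    pub use super::models::Song;
}

// Call sites import everything in one line.
use crate::prelude::*;

fn main() {
    let songs = vec![Song { title: "Example".into() }];
    println!("{}", song_count(&songs));
}
```

The trade-off is discoverability versus explicitness: glob-importing a curated prelude keeps imports short, while anything unusual still gets a full path.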
1407d2d985 Merge pull request 'Use api_fn macro' (#267) from 265-use-apifn-macro into main
All checks were successful
Reviewed-on: #267
2025-10-20 23:23:45 +00:00
0341efa99b Remove auth doc tests
All checks were successful
2025-10-20 19:16:10 -04:00
c9c72c74ba Use api_fn macro
Some checks failed
2025-10-20 18:49:39 -04:00
8afd5f9d20 Add LoggedInUser trait 2025-10-20 18:33:08 -04:00
5890b1486e Update libretunes_macro
All checks were successful
2025-10-20 18:15:28 -04:00
dcca317d00 Merge pull request 'Create utility component for easier Resource use' (#264) from 263-create-utility-component-for-easier-resource into main
All checks were successful
Reviewed-on: #264
2025-10-19 23:47:39 +00:00
c6b70a49f2 Use LoadResource for search page
All checks were successful
2025-10-19 19:39:02 -04:00
9ea3bf6aca Use LoadResource for playlist page 2025-10-19 19:34:30 -04:00
3328a698ba Use LoadResource for playlists on sidebar 2025-10-19 19:25:43 -04:00
f0c8ae731f Use LoadResource for profile page 2025-10-19 19:19:39 -04:00
2ed43348e1 Use LoadResource for album page 2025-10-19 19:13:17 -04:00
473583b3ef Use LoadResource for artist page 2025-10-19 19:08:41 -04:00
2cf36a499d Use LoadResource for song plays 2025-10-19 19:08:33 -04:00
09ebceec6c Use LoadResource for song info 2025-10-19 18:48:05 -04:00
1615622821 Use LoadResource for liked songs page 2025-10-19 18:29:00 -04:00
2abfba8c54 Create LoadResource component 2025-10-19 18:28:17 -04:00
777dc2c01e Merge pull request 'Bump libretunes-macro' (#259) from 258-bump-libretunesmacro into main
All checks were successful
Reviewed-on: #259
2025-10-19 01:47:08 +00:00
cf9da6812a Bump libretunes_macro in flake
All checks were successful
2025-10-18 20:01:47 -04:00
c81b11c739 Bump libretunes_macro
Some checks failed
2025-10-18 19:53:45 -04:00
625402c0b6 Merge pull request 'Remove clap and openssl' (#257) from 256-remove-unused-dependencies into main
Some checks failed
Reviewed-on: #257
2025-10-18 22:52:50 +00:00
cbf07e6865 Remove clap and openssl
All checks were successful
2025-10-18 18:45:43 -04:00
cb6fbc26b5 Merge pull request 'Serve precompressed files' (#255) from 253-serve-precompressed-files into main
All checks were successful
Reviewed-on: #255
2025-10-18 22:41:14 +00:00
fd92cd55dd Remove unused fileserv module
All checks were successful
2025-10-18 18:30:45 -04:00
bbfea8033c Use leptos_axum file server 2025-10-18 18:30:20 -04:00
71601f2c11 Merge remote-tracking branch 'origin/main' into 225-move-ingest-into-main-project
Some checks failed
2025-10-14 12:28:54 -04:00
1164eb6927 Merge pull request 'Update flake, fix docker build' (#252) from 251-update-flake-fix-docker-build into main
All checks were successful
Reviewed-on: #252
2025-10-14 16:24:12 +00:00
0fba4e6c04 Fix SongListInner function doesn't exist
All checks were successful
2025-10-14 12:10:38 -04:00
35fa3d5743 Allow &'static str in const params 2025-10-14 12:10:21 -04:00
e9a161d673 Update leptos, leptos-use, tachys 2025-10-14 12:08:55 -04:00
af80ae67f6 Update flake 2025-10-14 12:08:25 -04:00
60ae01690a Merge pull request 'Update assets path in Dockerfile' (#250) from 247-move-placeholders-and-favicon-to-different into main
Some checks failed
Reviewed-on: #250
2025-10-14 15:14:34 +00:00
11fee74353 Update assets path in Dockerfile
Some checks failed
2025-10-14 11:04:00 -04:00
def7df3621 Merge pull request 'Move LibreTunes files from assets/ to public/' (#249) from 247-move-placeholders-and-favicon-to-different into main
Some checks failed
Reviewed-on: #249
2025-10-14 14:50:54 +00:00
084ab07c25 Move LibreTunes files from assets/ to public/
Some checks failed
2025-10-14 10:45:11 -04:00
35b825d350 Merge branch '130-standardize-all-media-images'
Some checks failed
2025-10-14 10:33:39 -04:00
f3351833c4 Use ServeDir instead of manually created file server
Some checks failed
Fixes serving assets with '/' in filename
2025-10-14 10:24:53 -04:00
31a249b728 Change config paths to PathBuf instead of String 2025-10-14 10:20:01 -04:00
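Moving config paths from `String` to `PathBuf` pushes joining, prefix checks, and OS-specific separators through `Path` methods instead of ad-hoc string concatenation. A small sketch of the shape (field and method names are assumptions):

```rust
use std::path::{Path, PathBuf};

/// Config with a path-typed field instead of a raw String.
struct Config {
    music_dir: PathBuf,
}

impl Config {
    /// Resolve a track's on-disk location under the configured music dir.
    /// `join` handles separators; no manual "dir + \"/\" + name" strings.
    fn track_path(&self, relative: &str) -> PathBuf {
        self.music_dir.join(relative)
    }
}

fn main() {
    let cfg = Config { music_dir: PathBuf::from("/srv/music") };
    let p = cfg.track_path("album/track.mp3");
    assert!(p.starts_with(Path::new("/srv/music")));
    println!("{}", p.display());
}
```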
7ca2d8ad28 Merge pull request 'Enable hashing js/wasm/css in Leptos' (#246) from 245-enable-hashing-jswasmcss-in-leptos into main
Some checks failed
Reviewed-on: #246
2025-10-14 03:43:09 +00:00
e35b72954c Enable hashing for Leptos
Some checks failed
Use HashedStylesheet for CSS
2025-10-13 23:38:28 -04:00
104a822633 Merge pull request 'Add compression layer to Router' (#243) from 242-add-compression-to-router into main
Some checks failed
Push Workflows / docs (push) Successful in 57s
Push Workflows / rustfmt (push) Successful in 5s
Push Workflows / mdbook (push) Successful in 7s
Push Workflows / mdbook-server (push) Successful in 44s
Push Workflows / test (push) Successful in 2m50s
Push Workflows / clippy (push) Successful in 1m17s
Push Workflows / leptos-test (push) Successful in 2m48s
Push Workflows / build (push) Successful in 4m9s
Push Workflows / docker-build (push) Failing after 4m21s
Push Workflows / nix-build (push) Successful in 11m14s
Reviewed-on: #243
2025-10-14 03:32:53 +00:00
98bd544321 Add compression layer to Router
Some checks failed
Push Workflows / rustfmt (push) Successful in 12s
Push Workflows / mdbook (push) Successful in 14s
Push Workflows / mdbook-server (push) Successful in 27s
Push Workflows / docs (push) Successful in 55s
Push Workflows / clippy (push) Successful in 1m10s
Push Workflows / leptos-test (push) Successful in 2m14s
Push Workflows / test (push) Successful in 2m17s
Push Workflows / nix-build (push) Successful in 6m54s
Push Workflows / docker-build (push) Failing after 3m11s
Push Workflows / build (push) Successful in 3m14s
2025-10-13 23:20:26 -04:00
856f66c918 Paths refactoring
Some checks failed
Push Workflows / rustfmt (push) Successful in 9s
Push Workflows / mdbook (push) Successful in 22s
Push Workflows / mdbook-server (push) Successful in 4m20s
Push Workflows / docs (push) Successful in 6m25s
Push Workflows / clippy (push) Successful in 8m6s
Push Workflows / test (push) Successful in 11m5s
Push Workflows / docker-build (push) Failing after 12m5s
Push Workflows / leptos-test (push) Successful in 12m20s
Push Workflows / nix-build (push) Successful in 16m42s
Push Workflows / build (push) Successful in 13m12s
Update API and components to use LocalPath/WebPath
Remove img_fallback
Exchange some frontend and backend types
Other path-related changes
Update image/song uploading to use random paths
2025-10-11 19:04:55 -04:00
53e9a7131f Use Path types for models 2025-10-11 18:56:50 -04:00
493dafe9df Remove ImageAssetType 2025-10-11 18:04:14 -04:00
1696f5d8fa Create LocalPath type
Implement diesel ser/de for LocalPath
Move to_web_path functions into LocalPath
2025-10-11 18:04:14 -04:00
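The `LocalPath` commits above describe a newtype with a `to_web_path` conversion. A std-only sketch of that idea, with the exact signature and URL prefix assumed (the real type also implements diesel ser/de, omitted here):

```rust
use std::path::{Path, PathBuf};

// Sketch of the LocalPath newtype named in the commits; method shape is an assumption.
struct LocalPath(PathBuf);

impl LocalPath {
    // Convert an on-disk path under `root` into a URL-style path under `prefix`,
    // e.g. "/srv/audio/ab/cd.mp3" with root "/srv/audio" -> "/audio/ab/cd.mp3".
    fn to_web_path(&self, root: &Path, prefix: &str) -> Option<String> {
        let rel = self.0.strip_prefix(root).ok()?;
        let mut web = String::from(prefix);
        for comp in rel.components() {
            web.push('/');
            web.push_str(&comp.as_os_str().to_string_lossy());
        }
        Some(web)
    }
}

fn main() {
    let p = LocalPath(PathBuf::from("/srv/audio/ab/cd.mp3"));
    println!("{:?}", p.to_web_path(Path::new("/srv/audio"), "/audio"));
}
```

Returning `Option` makes a path outside the storage root an explicit `None` rather than a malformed URL.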
53b5d49409 Add WebPath 2025-10-11 18:04:13 -04:00
be9e7ffb3e Add BackendState::get_asset_path 2025-10-11 17:36:44 -04:00
a7181c9315 Move paths module out of ssr-only 2025-10-11 17:36:40 -04:00
df636a1ef2 Add function for path conversion with fallback 2025-10-05 17:16:00 -04:00
70e1c565ad Use path constants for file servers 2025-10-05 17:02:34 -04:00
2053077d79 Add constants for audio and image paths 2025-10-05 17:01:28 -04:00
04ab5f649e Add function to convert local paths to web paths 2025-10-05 16:33:21 -04:00
0d3067ac90 Move AssetType to paths util 2025-10-05 16:32:12 -04:00
1c2715cca5 Add url crate for path manipulation 2025-10-05 16:30:45 -04:00
a9f007f036 Add ImageAssetType and new_image_path 2025-10-05 15:43:57 -04:00
2c44a3d238 Rename random_paths to paths 2025-10-05 15:19:14 -04:00
aa53853a0f Recursively scan audio storage path
Some checks failed
Push Workflows / rustfmt (push) Successful in 13s
Push Workflows / mdbook (push) Successful in 17s
Push Workflows / test (push) Failing after 1m19s
Push Workflows / mdbook-server (push) Successful in 1m7s
Push Workflows / clippy (push) Failing after 1m22s
Push Workflows / docs (push) Successful in 1m27s
Push Workflows / build (push) Failing after 3m1s
Push Workflows / leptos-test (push) Failing after 2m57s
Push Workflows / docker-build (push) Failing after 3m47s
Push Workflows / nix-build (push) Successful in 7m23s
2025-10-04 17:14:32 -04:00
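The commit above makes the ingest scan recursive. A sketch of a recursive directory walk of the kind implied; the extension list and function shape are assumptions, not the project's actual code:

```rust
use std::fs;
use std::io;
use std::path::{Path, PathBuf};

// Recursively collect files with audio-looking extensions under `dir`.
// Extension set is illustrative only.
fn scan_audio(dir: &Path, out: &mut Vec<PathBuf>) -> io::Result<()> {
    for entry in fs::read_dir(dir)? {
        let path = entry?.path();
        if path.is_dir() {
            scan_audio(&path, out)?; // descend into subdirectories
        } else if matches!(
            path.extension().and_then(|e| e.to_str()),
            Some("mp3" | "flac" | "ogg" | "wav")
        ) {
            out.push(path);
        }
    }
    Ok(())
}

fn main() -> io::Result<()> {
    let mut found = Vec::new();
    scan_audio(Path::new("."), &mut found)?;
    println!("found {} audio files", found.len());
    Ok(())
}
```

Because `fs::read_dir` blocks, a scan like this is exactly the kind of work later moved onto `spawn_blocking` so it does not stall the async runtime.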
54bb1da10c Pass BackendState to ingest scan 2025-10-04 17:14:15 -04:00
9c6042bd2d Add audiotags error type 2025-10-04 17:11:24 -04:00
b452912b9b Add audiotags crate 2025-10-04 16:56:35 -04:00
ecbfd69257 Set up ingest task
Some checks failed
Push Workflows / rustfmt (push) Successful in 10s
Push Workflows / mdbook (push) Successful in 10s
Push Workflows / test (push) Failing after 40s
Push Workflows / docker-build (push) Failing after 46s
Push Workflows / clippy (push) Failing after 46s
Push Workflows / mdbook-server (push) Failing after 44s
Push Workflows / docs (push) Successful in 1m9s
Push Workflows / leptos-test (push) Failing after 1m24s
Push Workflows / build (push) Failing after 2m15s
Push Workflows / nix-build (push) Successful in 8m18s
2025-10-03 22:49:59 -04:00
f132d0d7b9 Create ingest module 2025-10-03 21:59:56 -04:00
d2bf32d24f Merge pull request 'Use cargo-leptos from nixpkgs' (#238) from 237-remove-cargoleptos-manual-installation-in-flake into main
Some checks failed
Push Workflows / docs (push) Successful in 2m51s
Push Workflows / test (push) Successful in 3m24s
Push Workflows / leptos-test (push) Successful in 4m21s
Push Workflows / docker-build (push) Failing after 4m29s
Push Workflows / build (push) Successful in 5m17s
Push Workflows / nix-build (push) Successful in 8m6s
Push Workflows / rustfmt (push) Successful in 6s
Push Workflows / mdbook (push) Successful in 10s
Push Workflows / mdbook-server (push) Successful in 40s
Push Workflows / clippy (push) Successful in 1m47s
Reviewed-on: #238
2025-09-27 23:52:10 +00:00
0ae50784ea Use cargo-leptos from nixpkgs
Some checks failed
Push Workflows / rustfmt (push) Successful in 8s
Push Workflows / mdbook (push) Successful in 12s
Push Workflows / mdbook-server (push) Successful in 28s
Push Workflows / docs (push) Successful in 50s
Push Workflows / clippy (push) Successful in 53s
Push Workflows / leptos-test (push) Successful in 1m37s
Push Workflows / nix-build (push) Successful in 6m33s
Push Workflows / test (push) Successful in 1m50s
Push Workflows / build (push) Successful in 3m4s
Push Workflows / docker-build (push) Failing after 3m23s
2025-09-27 19:33:16 -04:00
cf1e976cd2 Merge pull request 'Download cargo-leptos from GitHub releases' (#235) from 234-fix-cargoleptos-build-hang-inside-docker into main
Some checks failed
Push Workflows / rustfmt (push) Successful in 9s
Push Workflows / mdbook (push) Successful in 13s
Push Workflows / mdbook-server (push) Successful in 26s
Push Workflows / docs (push) Successful in 48s
Push Workflows / clippy (push) Successful in 53s
Push Workflows / leptos-test (push) Successful in 1m37s
Push Workflows / test (push) Successful in 1m55s
Push Workflows / build (push) Successful in 2m50s
Push Workflows / docker-build (push) Failing after 2m57s
Push Workflows / nix-build (push) Successful in 9m33s
Reviewed-on: #235
2025-09-12 15:07:22 +00:00
d9d73211b4 Download cargo-leptos from GitHub releases
All checks were successful
Push Workflows / rustfmt (push) Successful in 8s
Push Workflows / mdbook (push) Successful in 9s
Push Workflows / docs (push) Successful in 46s
Push Workflows / clippy (push) Successful in 53s
Push Workflows / mdbook-server (push) Successful in 1m14s
Push Workflows / test (push) Successful in 2m0s
Push Workflows / leptos-test (push) Successful in 1m45s
Push Workflows / build (push) Successful in 2m50s
Push Workflows / docker-build (push) Successful in 8m1s
Push Workflows / nix-build (push) Successful in 8m42s
2025-09-12 10:57:18 -04:00
2470eace33 Merge pull request 'Build with release mode in nix' (#233) from 232-build-with-release-mode-in-nix into main
All checks were successful
Push Workflows / rustfmt (push) Successful in 12s
Push Workflows / mdbook (push) Successful in 17s
Push Workflows / mdbook-server (push) Successful in 4m17s
Push Workflows / docs (push) Successful in 5m49s
Push Workflows / clippy (push) Successful in 8m46s
Push Workflows / test (push) Successful in 12m17s
Push Workflows / leptos-test (push) Successful in 13m30s
Push Workflows / build (push) Successful in 14m15s
Push Workflows / nix-build (push) Successful in 17m37s
Push Workflows / docker-build (push) Successful in 18m49s
Reviewed-on: #233
2025-09-02 20:36:51 +00:00
a09ba01c67 Build with --release in flake
Some checks failed
Push Workflows / rustfmt (push) Successful in 8s
Push Workflows / mdbook (push) Successful in 19s
Push Workflows / mdbook-server (push) Successful in 50s
Push Workflows / docs (push) Successful in 1m42s
Push Workflows / clippy (push) Successful in 7m50s
Push Workflows / test (push) Successful in 10m40s
Push Workflows / leptos-test (push) Successful in 11m34s
Push Workflows / build (push) Successful in 12m18s
Push Workflows / nix-build (push) Successful in 15m12s
Push Workflows / docker-build (push) Has been cancelled
2025-09-02 14:51:50 -04:00
4d1dde3893 Add binaryen package for wasm-opt for release builds 2025-09-02 14:51:05 -04:00
821a318550 Merge pull request 'Fix style not applying to SongList play/pause icon' (#231) from 230-fix-placement-of-songlist-playpause-button into main
Some checks failed
Push Workflows / rustfmt (push) Successful in 10s
Push Workflows / mdbook (push) Successful in 16s
Push Workflows / docs (push) Successful in 4m42s
Push Workflows / test (push) Has started running
Push Workflows / clippy (push) Has started running
Push Workflows / nix-build (push) Has been cancelled
Push Workflows / mdbook-server (push) Has been cancelled
Push Workflows / build (push) Has been cancelled
Push Workflows / leptos-test (push) Has been cancelled
Push Workflows / docker-build (push) Has been cancelled
Reviewed-on: #231
2025-06-29 19:34:58 +00:00
953b65e0cb Fix style not applying to SongList play/pause icon
Some checks failed
Push Workflows / rustfmt (push) Successful in 7s
Push Workflows / mdbook (push) Successful in 10s
Push Workflows / mdbook-server (push) Successful in 22s
Push Workflows / docs (push) Successful in 46s
Push Workflows / clippy (push) Successful in 1m1s
Push Workflows / leptos-test (push) Successful in 1m58s
Push Workflows / docker-build (push) Failing after 2m8s
Push Workflows / test (push) Successful in 2m14s
Push Workflows / build (push) Successful in 2m59s
Push Workflows / nix-build (push) Successful in 9m44s
This appeared to be caused by the reactive if statement.
It may be a bug in the reactive system; did not investigate further, since the new version works correctly.
2025-06-29 18:01:02 +00:00
a7cfcfe133 Merge remote-tracking branch 'origin/main' into 130-standardize-all-media-images 2025-06-28 01:16:40 +00:00

26f6bbb28a Merge pull request '228-create-unified-config-system' (#229) from 228-create-unified-config-system into main
Some checks failed
Push Workflows / rustfmt (push) Successful in 6s
Push Workflows / mdbook (push) Successful in 7s
Push Workflows / mdbook-server (push) Successful in 19s
Push Workflows / docs (push) Successful in 35s
Push Workflows / clippy (push) Successful in 44s
Push Workflows / leptos-test (push) Successful in 1m42s
Push Workflows / docker-build (push) Failing after 1m51s
Push Workflows / test (push) Successful in 1m54s
Push Workflows / build (push) Successful in 2m38s
Push Workflows / nix-build (push) Successful in 11m4s
Reviewed-on: #229
2025-06-28 01:13:22 +00:00
7af02947d9 Remove lazy_static dependency
All checks were successful
Push Workflows / rustfmt (push) Successful in 9s
Push Workflows / mdbook (push) Successful in 11s
Push Workflows / mdbook-server (push) Successful in 31s
Push Workflows / docs (push) Successful in 1m14s
Push Workflows / clippy (push) Successful in 1m29s
Push Workflows / leptos-test (push) Successful in 2m54s
Push Workflows / test (push) Successful in 3m9s
Push Workflows / build (push) Successful in 4m18s
Push Workflows / docker-build (push) Successful in 10m56s
Push Workflows / nix-build (push) Successful in 10m49s
Not required after BackendState implementation
2025-06-28 01:01:29 +00:00
cd1fff8a10 Remove unused imports
Some checks failed
Push Workflows / rustfmt (push) Successful in 7s
Push Workflows / mdbook (push) Successful in 9s
Push Workflows / docs (push) Successful in 50s
Push Workflows / clippy (push) Successful in 1m8s
Push Workflows / mdbook-server (push) Successful in 1m42s
Push Workflows / leptos-test (push) Successful in 2m1s
Push Workflows / test (push) Successful in 2m22s
Push Workflows / nix-build (push) Has been cancelled
Push Workflows / docker-build (push) Has been cancelled
Push Workflows / build (push) Has been cancelled
2025-06-28 00:58:07 +00:00
4b5b1209a5 Remove FromRequestParts for Config
Some checks failed
Push Workflows / rustfmt (push) Successful in 6s
Push Workflows / mdbook (push) Successful in 8s
Push Workflows / clippy (push) Failing after 34s
Push Workflows / docs (push) Successful in 42s
Push Workflows / build (push) Failing after 1m22s
Push Workflows / mdbook-server (push) Successful in 1m12s
Push Workflows / leptos-test (push) Successful in 1m30s
Push Workflows / test (push) Successful in 1m44s
Push Workflows / docker-build (push) Successful in 4m20s
Push Workflows / nix-build (push) Successful in 12m53s
2025-06-28 00:49:46 +00:00
83b56b9110 Move PG types back to util::database
Some checks failed
Push Workflows / rustfmt (push) Successful in 6s
Push Workflows / mdbook (push) Successful in 8s
Push Workflows / docs (push) Successful in 45s
Push Workflows / clippy (push) Successful in 50s
Push Workflows / mdbook-server (push) Successful in 1m18s
Push Workflows / leptos-test (push) Successful in 1m35s
Push Workflows / test (push) Successful in 1m48s
Push Workflows / build (push) Successful in 2m40s
Push Workflows / docker-build (push) Successful in 4m26s
Push Workflows / nix-build (push) Has been cancelled
2025-06-28 00:44:14 +00:00
36de234630 Add backend_state to AuthBackend 2025-06-28 00:41:10 +00:00
b43f9fae8c Add db_conn argument for functions required for auth 2025-06-28 00:39:00 +00:00
97f435c6d8 Implement Error for BackendError 2025-06-28 00:38:17 +00:00
7a79904aa4 Remove old database init/connection functions 2025-06-27 22:08:14 +00:00
7ddbee724b Use BackendState::get_db_conn() instead of database module 2025-06-27 22:08:14 +00:00
912c3b8adf Remove redis module
All functionality is now handled in BackendState
2025-06-27 22:08:14 +00:00
735f6758d7 Use BackendState for redis connection 2025-06-27 22:08:14 +00:00
e25f6ff5c4 Add function to BackendState to extract 2025-06-27 22:08:13 +00:00
8adefabc2f Initialize backend state, supply to requests instead of config 2025-06-27 22:08:13 +00:00
2795a1b754 Add BackendState 2025-06-27 22:08:13 +00:00
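The `BackendState` commits above replace per-request config threading with one shared state object. A minimal sketch of the pattern, assuming hypothetical field names: shared resources live behind an `Arc`, so cloning per request is cheap.

```rust
use std::sync::Arc;

// Hypothetical inner data; the real BackendState holds DB/redis handles.
struct Inner {
    database_url: String,
    audio_root: String,
}

// Cloning bumps the Arc refcount instead of copying the data.
#[derive(Clone)]
struct BackendState {
    inner: Arc<Inner>,
}

impl BackendState {
    fn new(database_url: String, audio_root: String) -> Self {
        BackendState {
            inner: Arc::new(Inner { database_url, audio_root }),
        }
    }
    fn database_url(&self) -> &str {
        &self.inner.database_url
    }
    fn audio_root(&self) -> &str {
        &self.inner.audio_root
    }
}

fn main() {
    let state = BackendState::new("postgres://localhost/app".into(), "/srv/audio".into());
    let per_request = state.clone(); // Arc bump, not a deep copy
    println!("{} {}", per_request.database_url(), per_request.audio_root());
}
```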
f25ebb85d2 Fix leptos tests using new errors
All checks were successful
Push Workflows / rustfmt (push) Successful in 5s
Push Workflows / mdbook (push) Successful in 7s
Push Workflows / docs (push) Successful in 40s
Push Workflows / clippy (push) Successful in 1m4s
Push Workflows / mdbook-server (push) Successful in 2m21s
Push Workflows / test (push) Successful in 2m56s
Push Workflows / build (push) Successful in 3m56s
Push Workflows / leptos-test (push) Successful in 5m51s
Push Workflows / nix-build (push) Successful in 11m20s
Push Workflows / docker-build (push) Successful in 11m49s
2025-06-27 00:56:18 +00:00
368f673fd7 Rewrite error handling and display
Some checks failed
Push Workflows / rustfmt (push) Successful in 11s
Push Workflows / mdbook (push) Successful in 16s
Push Workflows / mdbook-server (push) Successful in 4m20s
Push Workflows / docs (push) Successful in 5m44s
Push Workflows / clippy (push) Successful in 7m48s
Push Workflows / test (push) Successful in 11m14s
Push Workflows / leptos-test (push) Failing after 11m33s
Push Workflows / build (push) Successful in 12m53s
Push Workflows / nix-build (push) Successful in 16m54s
Push Workflows / docker-build (push) Successful in 17m40s
2025-06-26 00:01:49 +00:00
0541b77b66 Add more BackendError types
Remove "Error" from enum variant names
Add functions to create BackendErrors
Change Contextualize types
Implement conversion from all sub error types
2025-06-26 00:01:35 +00:00
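The error-handling commits above describe a `BackendError` enum with conversions from sub-error types plus a `BackendResult` alias. A sketch under those names, with the variant set assumed: `From` impls are what let `?` convert sub-errors automatically.

```rust
use std::fmt;
use std::io;

// Variant names follow the commit's convention of dropping the "Error" suffix;
// the actual variant set is an assumption.
#[derive(Debug)]
enum BackendError {
    Io(io::Error),
    NotFound(String),
}

impl fmt::Display for BackendError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            BackendError::Io(e) => write!(f, "I/O error: {e}"),
            BackendError::NotFound(what) => write!(f, "not found: {what}"),
        }
    }
}

impl std::error::Error for BackendError {}

// Conversion from a sub error type, so `?` works in BackendResult functions.
impl From<io::Error> for BackendError {
    fn from(e: io::Error) -> Self {
        BackendError::Io(e)
    }
}

type BackendResult<T> = Result<T, BackendError>;

fn read_config(path: &str) -> BackendResult<String> {
    Ok(std::fs::read_to_string(path)?) // io::Error -> BackendError via From
}

fn main() {
    if let Err(e) = read_config("/definitely/missing") {
        println!("{e}");
    }
}
```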
f8a774f389 Add BackendResult type 2025-06-25 23:59:34 +00:00
a6d57a84ce Remove unnecessary feature 2025-06-25 23:59:11 +00:00
cd39ec7252 Create a type for different backend errors 2025-06-16 15:27:08 +00:00
9181b12c01 Load Config at start of main, supply to requests with middleware 2025-06-16 15:27:08 +00:00
40909bfdb0 Implement FromRequestParts for Config 2025-06-10 02:07:34 +00:00
c7154f5008 Update CustomClient for websocket support 2025-06-10 01:37:47 +00:00
60f82fbc74 Add tokio-tungstenite crate 2025-06-10 01:30:52 +00:00
13111e3567 Fix axum route syntax 2025-06-08 21:14:52 +00:00
627746c0b3 Update leptos-use 2025-06-08 21:14:26 +00:00
a738341c5f Update axum-login 2025-06-08 21:13:54 +00:00
5cf2d5e6d9 Update tower-sessions-redis-store 2025-06-08 21:13:28 +00:00
20e2b03b14 Update axum 2025-06-08 21:13:05 +00:00
306d760f06 Update leptos_icons 2025-06-08 21:12:32 +00:00
7cec212c32 Update leptos crates 2025-06-08 20:34:16 +00:00
5cbeba5dbe Add config module 2025-06-08 20:10:14 +00:00
f43013a568 Add clap crate 2025-06-08 18:07:19 +00:00
04cbc7bd4b Add function to generate random paths
All checks were successful
Push Workflows / rustfmt (push) Successful in 8s
Push Workflows / mdbook (push) Successful in 10s
Push Workflows / docs (push) Successful in 39s
Push Workflows / clippy (push) Successful in 49s
Push Workflows / mdbook-server (push) Successful in 1m15s
Push Workflows / leptos-test (push) Successful in 1m28s
Push Workflows / test (push) Successful in 1m41s
Push Workflows / build (push) Successful in 2m46s
Push Workflows / docker-build (push) Successful in 6m1s
Push Workflows / nix-build (push) Successful in 13m8s
2025-06-07 19:41:27 +00:00
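The commit above adds random path generation (the next commit pulls in the `rand` crate for it). This std-only stand-in is an illustration of the shape only, hashing the current time plus a counter into a hex segment; the real code uses `rand` instead:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};
use std::time::{SystemTime, UNIX_EPOCH};

// Illustration only: a real implementation would draw from `rand`.
// Produces a 16-hex-char segment usable as an opaque file name.
fn random_segment(counter: u64) -> String {
    let nanos = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .unwrap()
        .subsec_nanos();
    let mut h = DefaultHasher::new();
    (nanos, counter).hash(&mut h);
    format!("{:016x}", h.finish())
}

fn main() {
    println!("{}/upload.mp3", random_segment(0));
}
```

Storing uploads under opaque names like this decouples the on-disk path from user-supplied file names, which ties in with the later "Update image/song uploading to use random paths" work.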
432cb659db Add rand crate 2025-06-07 17:36:11 +00:00
33dc7cb1a3 Add image_path to playlist model
All checks were successful
Push Workflows / rustfmt (push) Successful in 6s
Push Workflows / mdbook (push) Successful in 7s
Push Workflows / docs (push) Successful in 39s
Push Workflows / clippy (push) Successful in 49s
Push Workflows / mdbook-server (push) Successful in 1m14s
Push Workflows / leptos-test (push) Successful in 1m27s
Push Workflows / test (push) Successful in 1m38s
Push Workflows / build (push) Successful in 2m28s
Push Workflows / docker-build (push) Successful in 4m16s
Push Workflows / nix-build (push) Successful in 8m55s
2025-06-07 17:15:07 +00:00
92e13dfba7 Add image_path column to playlists table 2025-06-07 15:17:15 +00:00
f6211cbe2e Add image_path to user model 2025-06-07 15:12:52 +00:00
4a092fc78d Add image_path column to users table 2025-06-07 15:12:52 +00:00
d472a663ff Add image_path to artists model 2025-06-07 15:12:51 +00:00
8cad1816a7 Add image_path column to artists table 2025-06-07 14:53:15 +00:00
deaef81999 Fix lints
All checks were successful
Push Workflows / rustfmt (push) Successful in 7s
Push Workflows / docs (push) Successful in 45s
Push Workflows / mdbook (push) Successful in 10s
Push Workflows / clippy (push) Successful in 1m3s
Push Workflows / mdbook-server (push) Successful in 42s
Push Workflows / leptos-test (push) Successful in 2m33s
Push Workflows / test (push) Successful in 3m8s
Push Workflows / build (push) Successful in 4m34s
Push Workflows / docker-build (push) Successful in 11m49s
Push Workflows / nix-build (push) Successful in 11m57s
2025-05-30 17:03:43 +00:00
7d2375698c Merge pull request 'mdbook' (#227) from mdbook into main
Some checks failed
Push Workflows / rustfmt (push) Successful in 6s
Push Workflows / mdbook (push) Successful in 7s
Push Workflows / docs (push) Successful in 46s
Push Workflows / mdbook-server (push) Successful in 35s
Push Workflows / leptos-test (push) Successful in 2m16s
Push Workflows / clippy (push) Failing after 2m39s
Push Workflows / test (push) Successful in 2m48s
Push Workflows / build (push) Successful in 3m34s
Push Workflows / docker-build (push) Successful in 11m34s
Push Workflows / nix-build (push) Successful in 12m6s
Reviewed-on: #227
2025-05-28 03:07:17 +00:00
84451e2dac Set path to mdbook Dockerfile
Some checks failed
Push Workflows / rustfmt (push) Successful in 5s
Push Workflows / mdbook (push) Successful in 5s
Push Workflows / docker-build (push) Successful in 27s
Push Workflows / docs (push) Successful in 36s
Push Workflows / leptos-test (push) Successful in 1m48s
Push Workflows / test (push) Successful in 1m58s
Push Workflows / mdbook-server (push) Successful in 1m45s
Push Workflows / clippy (push) Failing after 1m58s
Push Workflows / build (push) Successful in 2m45s
Push Workflows / nix-build (push) Successful in 13m16s
2025-05-28 02:57:38 +00:00
579e7bbb48 Add CICD job to build mdbook
Some checks failed
Push Workflows / rustfmt (push) Successful in 44s
Push Workflows / mdbook (push) Successful in 1m28s
Push Workflows / docs (push) Successful in 7m51s
Push Workflows / clippy (push) Failing after 8m28s
Push Workflows / test (push) Successful in 13m15s
Push Workflows / leptos-test (push) Successful in 14m20s
Push Workflows / build (push) Successful in 15m5s
Push Workflows / docker-build (push) Successful in 18m32s
Push Workflows / mdbook-server (push) Successful in 18m37s
Push Workflows / nix-build (push) Successful in 19m9s
2025-05-28 02:40:21 +00:00
6e57dfc937 Add CICD job to build mdbook server 2025-05-28 02:40:11 +00:00
85d622fdb6 Add Dockerfile for mdbook 2025-05-28 02:29:04 +00:00
6a6bbfe8ed Add mdbook to flake 2025-05-28 01:54:24 +00:00
3803c20049 Initialize mdbook 2025-05-28 01:53:45 +00:00
544476d1ee Fix missing libgomp library in flake
All checks were successful
Push Workflows / rustfmt (push) Successful in 5s
Push Workflows / docker-build (push) Successful in 24s
Push Workflows / docs (push) Successful in 49s
Push Workflows / clippy (push) Successful in 49s
Push Workflows / leptos-test (push) Successful in 1m38s
Push Workflows / test (push) Successful in 1m44s
Push Workflows / build (push) Successful in 2m30s
Push Workflows / nix-build (push) Successful in 8m30s
2025-05-06 05:31:05 +00:00
1878f1feda Merge pull request 'Add liked songs page' (#224) from 222-add-liked-songs-page into main
All checks were successful
Push Workflows / rustfmt (push) Successful in 7s
Push Workflows / docs (push) Successful in 49s
Push Workflows / clippy (push) Successful in 47s
Push Workflows / leptos-test (push) Successful in 1m48s
Push Workflows / test (push) Successful in 2m7s
Push Workflows / nix-build (push) Successful in 12m26s
Push Workflows / build (push) Successful in 2m19s
Push Workflows / docker-build (push) Successful in 20s
Reviewed-on: #224
2025-05-06 03:31:04 +00:00
b727137fa8 Show liked songs at top of playlists
Some checks failed
Push Workflows / rustfmt (push) Successful in 8s
Push Workflows / docs (push) Successful in 54s
Push Workflows / clippy (push) Successful in 54s
Push Workflows / leptos-test (push) Successful in 2m51s
Push Workflows / test (push) Successful in 3m9s
Push Workflows / build (push) Successful in 4m25s
Push Workflows / docker-build (push) Failing after 12m59s
Push Workflows / nix-build (push) Successful in 16m11s
2025-05-06 03:17:47 +00:00
f61507b197 Add liked songs page 2025-05-06 03:17:35 +00:00
d2aebde562 Add API endpoint to get liked songs 2025-05-06 03:14:57 +00:00
0076f4f208 Merge pull request 'Fix typing space in add album/artist/song dialog' (#223) from 180-fix-typing-space into main
Some checks failed
Push Workflows / rustfmt (push) Successful in 6s
Push Workflows / docs (push) Successful in 39s
Push Workflows / clippy (push) Successful in 37s
Push Workflows / leptos-test (push) Successful in 1m21s
Push Workflows / test (push) Successful in 1m50s
Push Workflows / build (push) Successful in 2m53s
Push Workflows / docker-build (push) Failing after 13m21s
Push Workflows / nix-build (push) Successful in 16m37s
Reviewed-on: #223
2025-05-06 03:14:00 +00:00
ba0a531f2c Ignore space and arrow key events on input fields
Some checks failed
Push Workflows / rustfmt (push) Successful in 6s
Push Workflows / docs (push) Successful in 39s
Push Workflows / clippy (push) Successful in 36s
Push Workflows / leptos-test (push) Successful in 1m29s
Push Workflows / test (push) Successful in 1m50s
Push Workflows / build (push) Successful in 2m47s
Push Workflows / docker-build (push) Failing after 9m52s
Push Workflows / nix-build (push) Successful in 12m32s
2025-05-06 02:41:15 +00:00
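The fix above ignores space and arrow keys on input fields so global shortcuts don't fire while typing. The decision logic, extracted as a pure function (a sketch; the real code inspects DOM event targets in Leptos):

```rust
// Should a global shortcut handler act on this key press?
// `focus_in_input` is whether the event target is a text input.
fn handle_globally(key: &str, focus_in_input: bool) -> bool {
    if focus_in_input {
        // Let the input receive spaces and arrow keys normally.
        !matches!(key, " " | "ArrowLeft" | "ArrowRight" | "ArrowUp" | "ArrowDown")
    } else {
        true
    }
}

fn main() {
    println!("{}", handle_globally(" ", true));  // typing a space in a dialog
    println!("{}", handle_globally(" ", false)); // toggling play/pause
}
```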
2617ee8b95 Merge pull request 'Implement Playlists' (#219) from playlists-implementation into main
Some checks failed
Push Workflows / rustfmt (push) Successful in 6s
Push Workflows / docs (push) Successful in 39s
Push Workflows / clippy (push) Successful in 36s
Push Workflows / leptos-test (push) Successful in 1m14s
Push Workflows / test (push) Successful in 1m38s
Push Workflows / build (push) Successful in 2m35s
Push Workflows / docker-build (push) Failing after 9m57s
Push Workflows / nix-build (push) Successful in 12m20s
Reviewed-on: #219
2025-05-06 02:04:11 +00:00
4d1859b331 Add playlist page
Some checks failed
Push Workflows / rustfmt (push) Successful in 6s
Push Workflows / docs (push) Successful in 41s
Push Workflows / clippy (push) Successful in 37s
Push Workflows / leptos-test (push) Successful in 1m18s
Push Workflows / test (push) Successful in 1m39s
Push Workflows / build (push) Successful in 2m43s
Push Workflows / docker-build (push) Failing after 9m22s
Push Workflows / nix-build (push) Successful in 12m39s
2025-05-06 01:34:53 +00:00
c17aeb3822 Display playlists on sidebar 2025-05-06 01:34:44 +00:00
0e0d107d08 Include playlist resource in global state 2025-05-06 01:34:28 +00:00
463f3b744f Add styling for control-solid 2025-05-06 01:33:59 +00:00
28875c8669 Write playlist API functions 2025-05-06 01:33:21 +00:00
68778615b9 Move extract_field to util 2025-05-06 01:33:20 +00:00
58b5ed6d3f Add image fallback handler 2025-05-06 01:33:20 +00:00
f8c0134cf2 Add trigger to update playlist updated_at 2025-05-05 04:29:24 +00:00
a31539dc8f Merge pull request 'Separate types for inserting and fetching' (#218) from 217-separate-insert-fetch-types into main
Some checks failed
Push Workflows / rustfmt (push) Successful in 9s
Push Workflows / docs (push) Successful in 38s
Push Workflows / clippy (push) Successful in 36s
Push Workflows / leptos-test (push) Successful in 1m13s
Push Workflows / test (push) Successful in 1m30s
Push Workflows / build (push) Successful in 2m32s
Push Workflows / docker-build (push) Failing after 9m20s
Push Workflows / nix-build (push) Successful in 12m12s
Reviewed-on: #218
2025-05-05 02:32:46 +00:00
eda4e42150 Update flake.nix to use libretunes_macro git dependency
Some checks failed
Push Workflows / docs (push) Successful in 1m14s
Push Workflows / rustfmt (push) Successful in 10s
Push Workflows / clippy (push) Successful in 1m40s
Push Workflows / leptos-test (push) Successful in 3m0s
Push Workflows / test (push) Successful in 3m11s
Push Workflows / build (push) Successful in 4m6s
Push Workflows / docker-build (push) Failing after 10m51s
Push Workflows / nix-build (push) Successful in 13m11s
2025-05-05 02:08:36 +00:00
54d629d504 Use db_type for User
Some checks failed
Push Workflows / rustfmt (push) Successful in 8s
Push Workflows / nix-build (push) Failing after 45s
Push Workflows / docs (push) Successful in 3m20s
Push Workflows / clippy (push) Successful in 4m24s
Push Workflows / test (push) Successful in 5m58s
Push Workflows / leptos-test (push) Successful in 6m59s
Push Workflows / build (push) Successful in 7m53s
Push Workflows / docker-build (push) Failing after 11m47s
2025-05-05 01:25:20 +00:00
6486bbbdda Use db_type for Playlist 2025-05-05 01:10:17 +00:00
b727832c8e Use db_type for HistoryEntry 2025-05-05 01:07:07 +00:00
7c4058884e Use db_type for Artist 2025-05-05 01:05:20 +00:00
a67bd37d11 Use db_type for Album 2025-05-05 00:53:57 +00:00
3f43ef2d20 Use libretunes_macro::db_type instead of manual Song/NewSong structs 2025-05-05 00:38:37 +00:00
0b599f4038 Add libretunes_macro dependency 2025-05-05 00:38:01 +00:00
c02363c698 Create NewSong type 2025-05-04 21:36:34 +00:00
9da05edcd4 Merge pull request 'Create Search Bar Component' (#216) from 148-create-search-bar-component-2 into main
Some checks failed
Push Workflows / rustfmt (push) Successful in 4s
Push Workflows / docs (push) Successful in 30s
Push Workflows / clippy (push) Successful in 27s
Push Workflows / leptos-test (push) Successful in 59s
Push Workflows / test (push) Successful in 1m9s
Push Workflows / build (push) Successful in 2m7s
Push Workflows / docker-build (push) Failing after 9m20s
Push Workflows / nix-build (push) Successful in 12m8s
Reviewed-on: #216
2025-05-04 03:50:15 +00:00
f65d054612 Create search page
Some checks failed
Push Workflows / rustfmt (push) Successful in 5s
Push Workflows / clippy (push) Successful in 27s
Push Workflows / docs (push) Successful in 33s
Push Workflows / leptos-test (push) Successful in 1m8s
Push Workflows / test (push) Successful in 1m19s
Push Workflows / build (push) Successful in 2m26s
Push Workflows / docker-build (push) Failing after 9m13s
Push Workflows / nix-build (push) Successful in 12m18s
2025-05-04 03:35:12 +00:00
16cf406990 Merge pull request 'Return query match score for all search API results' (#215) from 26-return-query-match-score into main
Some checks failed
Push Workflows / rustfmt (push) Successful in 8s
Push Workflows / clippy (push) Successful in 40s
Push Workflows / docs (push) Successful in 49s
Push Workflows / leptos-test (push) Successful in 1m20s
Push Workflows / test (push) Successful in 1m39s
Push Workflows / build (push) Successful in 3m13s
Push Workflows / docker-build (push) Failing after 11m34s
Push Workflows / nix-build (push) Successful in 13m59s
Reviewed-on: #215
2025-05-03 18:28:40 +00:00
ed6cd4efcf Return query match score for all search results
Some checks failed
Push Workflows / rustfmt (push) Successful in 5s
Push Workflows / clippy (push) Successful in 28s
Push Workflows / docs (push) Successful in 33s
Push Workflows / leptos-test (push) Successful in 1m1s
Push Workflows / test (push) Successful in 1m9s
Push Workflows / build (push) Successful in 2m3s
Push Workflows / docker-build (push) Failing after 12m0s
Push Workflows / nix-build (push) Successful in 15m20s
Add SearchResult<T> type
Apply temporary fixes to upload page
2025-05-03 18:24:30 +00:00
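The commit above attaches a match score to every search API result via a generic wrapper. A sketch of that shape, with field names assumed:

```rust
// Generic wrapper pairing a result with its query match score.
struct SearchResult<T> {
    item: T,
    score: f32,
}

fn rank<T>(mut results: Vec<SearchResult<T>>) -> Vec<SearchResult<T>> {
    // Highest score first; total_cmp avoids the panics partial_cmp
    // unwrapping would hit on NaN scores.
    results.sort_by(|a, b| b.score.total_cmp(&a.score));
    results
}

fn main() {
    let ranked = rank(vec![
        SearchResult { item: "song A", score: 0.4 },
        SearchResult { item: "song B", score: 0.9 },
    ]);
    println!("{}", ranked[0].item);
}
```

Returning the score alongside each item lets the frontend merge and order song/album/artist results in one list.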
4d24a9bba2 Merge pull request 'Create health check binary' (#214) from 207-create-health-check-binary into main
Some checks failed
Push Workflows / docs (push) Successful in 50s
Push Workflows / clippy (push) Successful in 57s
Push Workflows / rustfmt (push) Successful in 12s
Push Workflows / leptos-test (push) Successful in 2m14s
Push Workflows / test (push) Successful in 2m30s
Push Workflows / build (push) Successful in 3m47s
Push Workflows / docker-build (push) Failing after 11m24s
Push Workflows / nix-build (push) Successful in 13m52s
Reviewed-on: #214
2025-05-03 06:32:09 +00:00
11cb502f53 Add healthcheck to Docker image
Some checks failed
Push Workflows / rustfmt (push) Successful in 6s
Push Workflows / docs (push) Successful in 37s
Push Workflows / clippy (push) Successful in 33s
Push Workflows / leptos-test (push) Successful in 1m10s
Push Workflows / test (push) Successful in 1m21s
Push Workflows / build (push) Successful in 2m39s
Push Workflows / docker-build (push) Failing after 12m51s
Push Workflows / nix-build (push) Successful in 15m52s
2025-05-03 06:27:38 +00:00
0ec9e5ed03 Include health_check binary in Docker image 2025-05-03 06:27:38 +00:00
7bccde7654 Create health check bin 2025-05-03 06:27:38 +00:00
bd69c46567 Merge pull request 'Create health check endpoint' (#213) from 206-create-health-check-endpoint into main
Some checks failed
Push Workflows / rustfmt (push) Successful in 8s
Push Workflows / docs (push) Successful in 43s
Push Workflows / clippy (push) Successful in 40s
Push Workflows / leptos-test (push) Successful in 1m36s
Push Workflows / test (push) Successful in 2m6s
Push Workflows / build (push) Successful in 3m35s
Push Workflows / docker-build (push) Failing after 10m49s
Push Workflows / nix-build (push) Successful in 14m0s
Reviewed-on: #213
2025-05-03 05:34:04 +00:00
e2a395ae7c Don't require auth for health check endpoint
Some checks failed
Push Workflows / rustfmt (push) Successful in 5s
Push Workflows / docs (push) Successful in 32s
Push Workflows / clippy (push) Successful in 28s
Push Workflows / leptos-test (push) Successful in 59s
Push Workflows / test (push) Successful in 1m7s
Push Workflows / build (push) Successful in 2m6s
Push Workflows / docker-build (push) Failing after 11m21s
Push Workflows / nix-build (push) Successful in 15m21s
2025-05-03 05:30:47 +00:00
6bb6322aa4 Create health check endpoint 2025-05-03 05:30:34 +00:00
a82da927b0 Move redis connection to util/redis.rs 2025-05-03 05:30:17 +00:00
6f571a338f Merge pull request 'Allow server function calls from non-WASM' (#212) from 211-allow-server-function-calls-from-non-wasm into main
Some checks failed
Push Workflows / rustfmt (push) Successful in 6s
Push Workflows / docs (push) Successful in 36s
Push Workflows / clippy (push) Successful in 33s
Push Workflows / leptos-test (push) Successful in 1m9s
Push Workflows / test (push) Successful in 1m20s
Push Workflows / build (push) Successful in 2m33s
Push Workflows / docker-build (push) Failing after 8m53s
Push Workflows / nix-build (push) Successful in 12m3s
Reviewed-on: #212
2025-05-03 04:38:01 +00:00
9cd1e8291a Use feature-specific Client in server functions
Some checks failed
Push Workflows / rustfmt (push) Successful in 10s
Push Workflows / docs (push) Successful in 1m43s
Push Workflows / clippy (push) Successful in 2m12s
Push Workflows / test (push) Successful in 4m17s
Push Workflows / leptos-test (push) Successful in 4m24s
Push Workflows / build (push) Successful in 6m2s
Push Workflows / docker-build (push) Failing after 12m7s
Push Workflows / nix-build (push) Successful in 15m46s
2025-05-03 04:24:28 +00:00
ff1b7401f2 Add custom client for non-browser RPC 2025-05-03 03:51:55 +00:00
d434a514a4 Create reqwest_api feature 2025-05-03 03:25:33 +00:00
10011a8859 Merge pull request '6-general-src-restructuring' (#210) from 6-general-src-restructuring into main
Some checks failed
Push Workflows / rustfmt (push) Successful in 10s
Push Workflows / clippy (push) Successful in 33s
Push Workflows / docs (push) Successful in 39s
Push Workflows / leptos-test (push) Successful in 1m17s
Push Workflows / test (push) Successful in 1m30s
Push Workflows / build (push) Successful in 2m41s
Push Workflows / docker-build (push) Failing after 9m20s
Push Workflows / nix-build (push) Successful in 12m26s
Reviewed-on: #210
2025-04-29 23:14:30 +00:00
16bc79aef4 Fix redundant closures (Clippy)
Some checks failed
Push Workflows / rustfmt (push) Successful in 4s
Push Workflows / docs (push) Successful in 29s
Push Workflows / clippy (push) Successful in 25s
Push Workflows / leptos-test (push) Successful in 54s
Push Workflows / test (push) Successful in 1m2s
Push Workflows / build (push) Successful in 1m52s
Push Workflows / docker-build (push) Failing after 8m18s
Push Workflows / nix-build (push) Successful in 13m53s
2025-04-29 22:41:47 +00:00
297c22d832 Run rustfmt
Some checks failed
Push Workflows / docs (push) Successful in 1m35s
Push Workflows / rustfmt (push) Successful in 11s
Push Workflows / clippy (push) Failing after 58s
Push Workflows / leptos-test (push) Successful in 3m4s
Push Workflows / test (push) Successful in 3m22s
Push Workflows / build (push) Successful in 4m43s
Push Workflows / docker-build (push) Failing after 14m42s
Push Workflows / nix-build (push) Successful in 17m22s
2025-04-29 22:27:24 +00:00
5fb84bd29e Add rustfmt CICD job
Some checks failed
Push Workflows / rustfmt (push) Failing after 5s
Push Workflows / clippy (push) Successful in 32s
Push Workflows / docs (push) Successful in 38s
Push Workflows / leptos-test (push) Successful in 1m40s
Push Workflows / test (push) Successful in 2m13s
Push Workflows / build (push) Successful in 4m7s
Push Workflows / docker-build (push) Failing after 15m19s
Push Workflows / nix-build (push) Successful in 18m33s
2025-04-29 22:25:44 +00:00
b9cbe22562 Use contains instead of .iter().any() (Clippy)
Some checks failed
Push Workflows / test (push) Successful in 49s
Push Workflows / build (push) Successful in 1m39s
Push Workflows / leptos-test (push) Successful in 1m15s
Push Workflows / docs (push) Successful in 37s
Push Workflows / clippy (push) Successful in 2m1s
Push Workflows / docker-build (push) Failing after 8m53s
Push Workflows / nix-build (push) Successful in 11m19s
2025-04-29 21:33:01 +00:00
aaec0523a4 Combine else + if into else if (Clippy) 2025-04-29 21:32:47 +00:00
ff8fd283b6 Fix Clippy format string lints 2025-04-29 21:32:18 +00:00
cf35961516 Automatically start playing in skip_to 2025-04-29 21:29:15 +00:00
976790342c Call audio play/pause when PlayStatus playing updated 2025-04-29 21:26:22 +00:00
29534a473b Rename songpage.rs to song.rs
Some checks failed
Push Workflows / build (push) Successful in 1m51s
Push Workflows / test (push) Successful in 3m51s
Push Workflows / leptos-test (push) Successful in 52s
Push Workflows / docs (push) Successful in 24s
Push Workflows / clippy (push) Failing after 1m53s
Push Workflows / docker-build (push) Failing after 9m31s
Push Workflows / nix-build (push) Successful in 10m55s
2025-04-29 20:51:09 +00:00
673d6e7651 Rename albumpage.rs to album.rs 2025-04-29 20:49:46 +00:00
2def629dc1 Update song page to use TailwindCSS
Some checks failed
Push Workflows / build (push) Successful in 1m50s
Push Workflows / leptos-test (push) Successful in 1m12s
Push Workflows / docs (push) Successful in 28s
Push Workflows / clippy (push) Has been cancelled
Push Workflows / nix-build (push) Has been cancelled
Push Workflows / docker-build (push) Has been cancelled
Push Workflows / test (push) Has been cancelled
2025-04-29 20:48:22 +00:00
f1126c2534 Update flake 2025-04-29 20:36:39 +00:00
c593501572 Refactor profile page to use TailwindCSS 2025-04-29 20:09:28 +00:00
797fea93b2 Fix artist image path in profile API 2025-04-29 20:02:37 +00:00
c6e6eb1f41 Increase recursion limit in lib target 2025-04-29 19:57:27 +00:00
b963849072 Move Queue before PlayBar
Some checks failed
Push Workflows / test (push) Failing after 4m24s
Push Workflows / build (push) Successful in 7m24s
Push Workflows / docs (push) Successful in 1m42s
Push Workflows / leptos-test (push) Successful in 5m2s
Push Workflows / docker-build (push) Failing after 10m50s
Push Workflows / clippy (push) Failing after 1m48s
Push Workflows / nix-build (push) Failing after 7m37s
2025-04-29 19:17:55 +00:00
c7ee25c1b4 Close queue when clicked away 2025-04-29 19:17:55 +00:00
6a0f814cd3 Update to leptos 0.7.8
Signed-off-by: Ethan Girouard <ethan@girouard.com>
2025-04-29 19:03:22 +00:00
3027a1f00c Change Dashboard menu to / instead of /dashboard
Some checks failed
Push Workflows / test (push) Failing after 8m8s
Push Workflows / docs (push) Successful in 2m47s
Push Workflows / clippy (push) Successful in 2m18s
Push Workflows / build (push) Successful in 6m50s
Push Workflows / docker-build (push) Failing after 9m17s
Push Workflows / leptos-test (push) Successful in 4m49s
Push Workflows / nix-build (push) Successful in 12m56s
2025-04-05 13:59:47 -04:00
b735b25677 Use tailwindcss for Personal component
Some checks failed
Push Workflows / test (push) Failing after 7m47s
Push Workflows / docker-build (push) Successful in 9m49s
Push Workflows / docs (push) Successful in 2m48s
Push Workflows / build (push) Successful in 15m54s
Push Workflows / clippy (push) Successful in 2m11s
Push Workflows / leptos-test (push) Successful in 12m24s
Push Workflows / nix-build (push) Successful in 31m57s
Close login/out buttons when clicking away
2025-03-26 17:09:59 -04:00
0bb8871296 Use tailwindcss for Error component 2025-03-26 14:00:23 -04:00
388ef55552 Refactor album page
Use tailwindcss classes
Refactor API endpoints
2025-03-26 13:53:44 -04:00
fae4767313 Remove white background from loading indicator 2025-03-26 13:49:45 -04:00
7b5b9fbe15 Move AlbumInfo component into album page
Some checks failed
Push Workflows / build (push) Failing after 20s
Push Workflows / leptos-test (push) Failing after 19s
Push Workflows / docs (push) Successful in 3m27s
Push Workflows / test (push) Failing after 8m33s
Push Workflows / clippy (push) Successful in 4m42s
Push Workflows / docker-build (push) Successful in 27m40s
Push Workflows / nix-build (push) Successful in 44m34s
2025-03-21 22:04:53 -04:00
d0e849bd0e Update artist page to use tailwind
Some checks failed
Push Workflows / build (push) Successful in 6m6s
Push Workflows / test (push) Failing after 7m12s
Push Workflows / docker-build (push) Successful in 8m40s
Push Workflows / docs (push) Successful in 2m22s
Push Workflows / leptos-test (push) Successful in 4m38s
Push Workflows / clippy (push) Successful in 2m23s
Push Workflows / nix-build (push) Successful in 34m40s
2025-03-16 18:08:20 -04:00
ae9243e9f3 Update DashboardRow to use tailwind 2025-03-16 17:40:00 -04:00
f4f6e1e4a6 Prevent main page component from extending beyond page 2025-03-16 16:49:09 -04:00
5a71973388 Remove old dashboard-container and home-component wrappers 2025-03-15 23:17:59 -04:00
c782aa2cd4 Increase recursion limit to fix compile error
Some checks failed
Push Workflows / test (push) Failing after 12m18s
Push Workflows / build (push) Successful in 24m34s
Push Workflows / leptos-test (push) Successful in 19m43s
Push Workflows / docs (push) Successful in 7m49s
Push Workflows / docker-build (push) Successful in 35m51s
Push Workflows / clippy (push) Successful in 7m12s
Push Workflows / nix-build (push) Successful in 44m21s
2025-03-15 23:14:32 -04:00
082c1a2128 Update SongList to use tailwind 2025-03-15 23:12:08 -04:00
6572d7313a Move sidebar navigation to separate module
Some checks failed
Push Workflows / test (push) Failing after 9m50s
Push Workflows / build (push) Failing after 16m49s
Push Workflows / docker-build (push) Failing after 21m36s
Push Workflows / docs (push) Successful in 4m49s
Push Workflows / leptos-test (push) Successful in 14m36s
Push Workflows / clippy (push) Successful in 5m9s
Push Workflows / nix-build (push) Failing after 26m50s
2025-03-07 23:26:11 -05:00
f1862a6bd6 Rename sidebar Bottom component to Playlists 2025-03-07 23:15:56 -05:00
347ad39fae Make sidebar Bottom component a home-card 2025-03-07 23:14:48 -05:00
318892adc1 Make Personal component a home-card 2025-03-07 23:11:38 -05:00
1a2b7510f8 Remove "extern crate" used for musl builds 2025-03-07 22:57:04 -05:00
3b452e32d8 Switch home-container div to section, use tailwind 2025-02-18 22:49:55 -05:00
7d410c2419 Add home-card style class 2025-02-18 22:49:29 -05:00
b55104144b Convert playbar to using tailwindcss
All checks were successful
Push Workflows / docs (push) Successful in 1m5s
Push Workflows / clippy (push) Successful in 1m26s
Push Workflows / leptos-test (push) Successful in 2m26s
Push Workflows / test (push) Successful in 2m30s
Push Workflows / build (push) Successful in 4m48s
Push Workflows / docker-build (push) Successful in 7m21s
Push Workflows / nix-build (push) Successful in 25m44s
2025-02-18 19:17:21 -05:00
911d375a95 Rewrite signup page using tailwindcss
All checks were successful
Push Workflows / clippy (push) Successful in 1m14s
Push Workflows / docs (push) Successful in 1m54s
Push Workflows / leptos-test (push) Successful in 2m20s
Push Workflows / test (push) Successful in 2m25s
Push Workflows / build (push) Successful in 5m45s
Push Workflows / docker-build (push) Successful in 7m25s
Push Workflows / nix-build (push) Successful in 24m4s
2025-02-18 15:14:33 -05:00
9b20395876 Rewrite login page using tailwindcss 2025-02-18 15:04:54 -05:00
ea869ce983 Ignore automatically generated but unused tailwind config 2025-02-18 14:59:24 -05:00
b1299ca28c Remove action="POST" from forms with submit handler
All checks were successful
Push Workflows / docs (push) Successful in 9m6s
Push Workflows / docker-build (push) Successful in 10m38s
Push Workflows / test (push) Successful in 12m40s
Push Workflows / clippy (push) Successful in 10m38s
Push Workflows / build (push) Successful in 18m18s
Push Workflows / leptos-test (push) Successful in 19m43s
Push Workflows / nix-build (push) Successful in 30m19s
2025-02-18 00:22:21 -05:00
9dfc556bd0 Ensure cargo-leptos uses tailwindcss v4 in Docker build
Some checks failed
Push Workflows / docs (push) Successful in 1m37s
Push Workflows / clippy (push) Successful in 2m3s
Push Workflows / build (push) Failing after 2m32s
Push Workflows / test (push) Successful in 3m17s
Push Workflows / leptos-test (push) Successful in 3m33s
Push Workflows / docker-build (push) Successful in 19m23s
Push Workflows / nix-build (push) Successful in 26m14s
2025-02-17 23:06:28 -05:00
3a29ce4741 Use new tailwindcss configuration method 2025-02-17 22:38:37 -05:00
7d6c1e66bc Update tailwindcss to v4 2025-02-17 22:37:00 -05:00
478f8362af Update flake 2025-02-17 22:36:09 -05:00
fc8825d765 Update tower-sessions-redis-store and axum-login
All checks were successful
Push Workflows / docs (push) Successful in 8m10s
Push Workflows / clippy (push) Successful in 9m24s
Push Workflows / build (push) Successful in 12m20s
Push Workflows / test (push) Successful in 14m25s
Push Workflows / leptos-test (push) Successful in 16m24s
Push Workflows / docker-build (push) Successful in 22m53s
Push Workflows / nix-build (push) Successful in 32m6s
2025-02-17 20:15:39 -05:00
99fac1fe8f Upgrade leptos to 0.7.7 2025-02-12 21:37:36 -05:00
cfbc84343b Add FancyInput component for forms
Some checks failed
Push Workflows / clippy (push) Successful in 55s
Push Workflows / docs (push) Successful in 1m4s
Push Workflows / leptos-test (push) Successful in 5m46s
Push Workflows / nix-build (push) Successful in 21m58s
Push Workflows / test (push) Successful in 1m37s
Push Workflows / build (push) Has been cancelled
Push Workflows / docker-build (push) Has been cancelled
2025-02-12 18:57:52 -05:00
49455f5e03 Convert loading to use Tailwind styling 2025-02-10 17:25:57 -05:00
f482e06076 Switch to Tailwind CSS
All checks were successful
Push Workflows / docs (push) Successful in 48s
Push Workflows / clippy (push) Successful in 58s
Push Workflows / leptos-test (push) Successful in 1m47s
Push Workflows / test (push) Successful in 1m59s
Push Workflows / build (push) Successful in 2m56s
Push Workflows / docker-build (push) Successful in 13m34s
Push Workflows / nix-build (push) Successful in 26m40s
2025-02-07 10:55:22 -05:00
56902f1ff2 Move dashboard to pages
All checks were successful
Push Workflows / clippy (push) Successful in 58s
Push Workflows / leptos-test (push) Successful in 1m28s
Push Workflows / test (push) Successful in 2m2s
Push Workflows / docs (push) Successful in 2m26s
Push Workflows / build (push) Successful in 2m52s
Push Workflows / docker-build (push) Successful in 8m39s
Push Workflows / nix-build (push) Successful in 22m8s
2025-02-06 11:21:12 -05:00
4dfc789f58 Move search to pages 2025-02-06 11:20:22 -05:00
f04ad57a5a Move error_template to components
Some checks failed
Push Workflows / clippy (push) Successful in 1m7s
Push Workflows / docs (push) Successful in 1m17s
Push Workflows / test (push) Successful in 1m39s
Push Workflows / leptos-test (push) Successful in 1m44s
Push Workflows / build (push) Successful in 3m35s
Push Workflows / docker-build (push) Successful in 7m5s
Push Workflows / nix-build (push) Has been cancelled
2025-02-06 11:11:19 -05:00
fac75e1f54 Move auth_backend to util 2025-02-06 11:09:36 -05:00
afd8f014b2 Move users to api 2025-02-06 11:03:47 -05:00
362b8161e3 Fix doc comment indentation
All checks were successful
Push Workflows / leptos-test (push) Successful in 1m42s
Push Workflows / test (push) Successful in 2m6s
Push Workflows / clippy (push) Successful in 2m30s
Push Workflows / build (push) Successful in 6m19s
Push Workflows / docs (push) Successful in 6m26s
Push Workflows / docker-build (push) Successful in 8m44s
Push Workflows / nix-build (push) Successful in 23m52s
2025-02-06 10:22:54 -05:00
0f48dfeada Fix clippy lint errors
Some checks failed
Push Workflows / test (push) Successful in 1m33s
Push Workflows / docs (push) Successful in 1m45s
Push Workflows / clippy (push) Failing after 2m25s
Push Workflows / build (push) Successful in 4m0s
Push Workflows / leptos-test (push) Successful in 6m33s
Push Workflows / docker-build (push) Successful in 7m44s
Push Workflows / nix-build (push) Successful in 19m55s
2025-02-05 22:56:11 -05:00
7a0ae4c028 Configure clippy lints 2025-02-05 22:55:22 -05:00
742f0e2be6 Update doctests to reflect moved modules
Some checks failed
Push Workflows / docs (push) Successful in 1m16s
Push Workflows / leptos-test (push) Successful in 1m59s
Push Workflows / clippy (push) Failing after 2m4s
Push Workflows / test (push) Successful in 2m7s
Push Workflows / build (push) Successful in 2m46s
Push Workflows / docker-build (push) Successful in 7m28s
Push Workflows / nix-build (push) Successful in 25m13s
2025-02-05 22:06:43 -05:00
57d7459976 Add clippy CICD job
Some checks failed
Push Workflows / docker-build (push) Successful in 55s
Push Workflows / docs (push) Successful in 58s
Push Workflows / leptos-test (push) Failing after 1m21s
Push Workflows / clippy (push) Failing after 2m19s
Push Workflows / test (push) Successful in 2m31s
Push Workflows / build (push) Successful in 3m27s
Push Workflows / nix-build (push) Has been cancelled
2025-02-05 21:58:25 -05:00
a83a051d89 Remove unused backend models functions
Some checks failed
Push Workflows / leptos-test (push) Failing after 1m2s
Push Workflows / docs (push) Successful in 1m7s
Push Workflows / test (push) Successful in 1m36s
Push Workflows / build (push) Successful in 5m37s
Push Workflows / docker-build (push) Successful in 7m53s
Push Workflows / nix-build (push) Successful in 18m58s
2025-02-05 21:21:14 -05:00
745d4c1b0a Move auth to api 2025-02-05 21:05:45 -05:00
6666002533 Move upload to api 2025-02-05 21:04:26 -05:00
a33a891d87 Move song to components 2025-02-05 21:04:26 -05:00
9b22a82514 Move search to api 2025-02-05 20:58:00 -05:00
59b9db34cf Move playbar to components 2025-02-05 20:56:07 -05:00
d03eed78e7 Move queue to components 2025-02-05 20:53:25 -05:00
fc64b0cf1c Combine artist albums DB queries into single query 2025-02-05 15:20:03 -05:00
6a52598956 Combine user profile history DB queries into single query 2025-02-05 15:18:32 -05:00
e42247ee84 Move data types into models/frontend and models/backend 2025-02-05 12:34:48 -05:00
d72ed532c1 Move database to util 2025-02-04 22:54:06 -05:00
c3bc042027 Move pages.rs into pages/mod.rs
All checks were successful
Push Workflows / docs (push) Successful in 1m1s
Push Workflows / leptos-test (push) Successful in 1m45s
Push Workflows / test (push) Successful in 2m1s
Push Workflows / build (push) Successful in 4m8s
Push Workflows / docker-build (push) Successful in 18m29s
Push Workflows / nix-build (push) Successful in 20m30s
2025-02-04 22:47:43 -05:00
a67e486f75 Move components.rs into components/mod.rs 2025-02-04 22:47:18 -05:00
841251639e Move fileserv into util 2025-02-04 22:46:39 -05:00
0d2a83f508 Merge pull request 'Update audio source when status is updated' (#204) from 198-update-audio-source-when-status-is into main
All checks were successful
Push Workflows / docs (push) Successful in 1m4s
Push Workflows / docker-build (push) Successful in 1m7s
Push Workflows / leptos-test (push) Successful in 1m28s
Push Workflows / test (push) Successful in 2m6s
Push Workflows / build (push) Successful in 6m36s
Push Workflows / nix-build (push) Successful in 25m2s
Reviewed-on: #204
2025-02-03 03:04:14 +00:00
2d4a9ac9fd Merge branch 'main' into 198-update-audio-source-when-status-is
All checks were successful
Push Workflows / docs (push) Successful in 46s
Push Workflows / test (push) Successful in 1m38s
Push Workflows / leptos-test (push) Successful in 1m46s
Push Workflows / build (push) Successful in 2m52s
Push Workflows / docker-build (push) Successful in 8m31s
Push Workflows / nix-build (push) Successful in 18m23s
2025-02-02 22:01:03 -05:00
be053ffa62 Merge pull request 'Use timestamp instead of date for song added_date' (#203) from 201-use-timestamp-instead-of-date-for into main
All checks were successful
Push Workflows / docs (push) Successful in 44s
Push Workflows / docker-build (push) Successful in 1m17s
Push Workflows / leptos-test (push) Successful in 1m36s
Push Workflows / test (push) Successful in 1m44s
Push Workflows / build (push) Successful in 4m56s
Push Workflows / nix-build (push) Successful in 20m48s
Reviewed-on: #203
2025-02-02 22:17:10 +00:00
e7a8491653 Use timestamp instead of date for added_date column in songs table
All checks were successful
Push Workflows / docs (push) Successful in 56s
Push Workflows / test (push) Successful in 2m23s
Push Workflows / leptos-test (push) Successful in 2m31s
Push Workflows / build (push) Successful in 3m24s
Push Workflows / docker-build (push) Successful in 8m12s
Push Workflows / nix-build (push) Successful in 18m0s
2025-02-02 16:42:24 -05:00
2116dc9058 Merge pull request 'Update to leptos 0.7' (#202) from 171-update-to-leptos-07 into main
All checks were successful
Push Workflows / nix-build (push) Successful in 21m38s
Push Workflows / build (push) Successful in 3m45s
Push Workflows / docker-build (push) Successful in 1m14s
Push Workflows / docs (push) Successful in 2m3s
Push Workflows / leptos-test (push) Successful in 3m51s
Push Workflows / test (push) Successful in 6m32s
Reviewed-on: #202
2025-02-02 21:24:51 +00:00
a093068625 Add --locked to cargo-leptos install command
All checks were successful
Push Workflows / docs (push) Successful in 50s
Push Workflows / leptos-test (push) Successful in 1m29s
Push Workflows / test (push) Successful in 1m47s
Push Workflows / nix-build (push) Successful in 21m37s
Push Workflows / docker-build (push) Successful in 22m2s
Push Workflows / build (push) Successful in 6m53s
2025-02-02 15:47:25 -05:00
0739b0026b Update cargo-leptos to 0.2.26
Some checks failed
Push Workflows / docs (push) Successful in 1m46s
Push Workflows / test (push) Successful in 2m25s
Push Workflows / leptos-test (push) Successful in 3m30s
Push Workflows / docker-build (push) Failing after 4m58s
Push Workflows / nix-build (push) Has been cancelled
Push Workflows / build (push) Successful in 7m5s
Update wasm-bindgen to 0.2.100
2025-02-02 15:37:19 -05:00
698931d915 Update to leptos 0.7.5
Some checks failed
Push Workflows / docs (push) Successful in 1m47s
Push Workflows / leptos-test (push) Successful in 4m0s
Push Workflows / test (push) Successful in 4m41s
Push Workflows / build (push) Failing after 6m40s
Push Workflows / docker-build (push) Failing after 13m6s
Push Workflows / nix-build (push) Successful in 20m4s
2025-02-02 15:14:46 -05:00
38bc2fbe92 Use effect to set audio source when PlayStatus changes
All checks were successful
Push Workflows / docs (push) Successful in 39s
Push Workflows / leptos-test (push) Successful in 1m7s
Push Workflows / build (push) Successful in 2m7s
Push Workflows / test (push) Successful in 2m48s
Push Workflows / nix-build (push) Successful in 17m57s
Push Workflows / docker-build (push) Successful in 3m31s
2025-01-07 15:13:07 -05:00
d3e9c5d869 Fix broken get_audio test
Some checks failed
Push Workflows / docs (push) Successful in 48s
Push Workflows / test (push) Successful in 1m45s
Push Workflows / build (push) Failing after 2m31s
Push Workflows / leptos-test (push) Successful in 3m51s
Push Workflows / docker-build (push) Successful in 8m0s
Push Workflows / nix-build (push) Successful in 21m7s
2024-12-28 16:37:09 -05:00
64c37dc327 Add underscores to fix (incorrect?) unused warning
Some checks failed
Push Workflows / docs (push) Successful in 43s
Push Workflows / test (push) Successful in 1m34s
Push Workflows / build (push) Failing after 2m1s
Push Workflows / leptos-test (push) Failing after 3m12s
Push Workflows / docker-build (push) Successful in 8m0s
Push Workflows / nix-build (push) Successful in 20m41s
2024-12-28 16:11:39 -05:00
abd0f87d41 Use Signal instead of MaybeSignal 2024-12-28 16:09:53 -05:00
ec1c57a67d Use Memo::new instead of create_memo 2024-12-28 16:06:52 -05:00
262f3634bf Use NodeRef::new instead of create_node_ref 2024-12-28 16:05:27 -05:00
e533132273 Use Effect::new instead of create_effect 2024-12-28 16:02:27 -05:00
d89d9d3548 Use signal instead of create_signal 2024-12-28 16:01:32 -05:00
2cfd698978 Remove unused imports 2024-12-28 16:00:18 -05:00
57406b5940 Use RwSignal::new instead of create_rw_signal 2024-12-28 15:47:06 -05:00
628684a259 Use dialog instead of div for upload
Some checks failed
Push Workflows / build (push) Failing after 1m55s
Push Workflows / docs (push) Successful in 2m14s
Push Workflows / leptos-test (push) Failing after 3m39s
Push Workflows / test (push) Successful in 4m2s
Push Workflows / docker-build (push) Successful in 18m33s
Push Workflows / nix-build (push) Successful in 20m7s
2024-12-28 15:40:29 -05:00
96835e684a Use node_ref instead of _ref in DashboardRow 2024-12-28 15:38:31 -05:00
aa9e26459f Remove .await for loading config 2024-12-28 15:38:09 -05:00
69b3066a3b Add HTML boilerplate in shell 2024-12-28 15:37:49 -05:00
3368d16c96 Use .get() on TextProp 2024-12-28 15:24:07 -05:00
141034eacd Remove type prop from <audio> 2024-12-28 12:36:42 -05:00
55521fd7fe Remove align prop from playbar divs 2024-12-28 12:36:21 -05:00
40d6440d99 Use hydrate_body instead of mount_to_body 2024-12-28 12:28:29 -05:00
daf8a50863 Use new HTML types for getting audio component 2024-12-28 12:28:17 -05:00
099c1042a2 Use raw strings instead of TextProp for classes in SongList 2024-12-28 12:27:03 -05:00
b3748374d4 Fix type of setting logged in user resource 2024-12-28 12:26:25 -05:00
5235854af7 Use tbody for table in SongList 2024-12-28 12:24:37 -05:00
915d5ea6f7 Remove options arg from render_app_to_stream call 2024-12-28 12:20:23 -05:00
f5c863f2a6 Merge pull request 'Turn DashboardRow and DashboardTile into components' (#194) from 193-run-dashboardrow-and-dashboardtile-into-components into main
Some checks failed
Push Workflows / docs (push) Successful in 3m6s
Push Workflows / test (push) Successful in 3m51s
Push Workflows / leptos-test (push) Successful in 4m52s
Push Workflows / build (push) Successful in 7m16s
Push Workflows / docker-build (push) Has been cancelled
Push Workflows / nix-build (push) Has been cancelled
Reviewed-on: #194
2024-12-26 17:30:39 +00:00
ec01183dc2 Use Arc for response handler instead of Rc 2024-12-24 16:54:07 -05:00
3dd040afd0 Merge remote-tracking branch 'origin/193-run-dashboardrow-and-dashboardtile-into-components' into 171-update-to-leptos-07 2024-12-24 16:50:45 -05:00
c900cb896e Use new DashboardRow / DashboardTile in artist and profile pages
All checks were successful
Push Workflows / test (push) Successful in 47s
Push Workflows / docs (push) Successful in 1m8s
Push Workflows / leptos-test (push) Successful in 1m12s
Push Workflows / build (push) Successful in 1m37s
Push Workflows / docker-build (push) Successful in 4m49s
Push Workflows / nix-build (push) Successful in 17m1s
2024-12-24 16:48:49 -05:00
2af8310077 Implement Into<DashboardTile> instead of implementing old trait DashboardTile 2024-12-24 16:48:48 -05:00
1a4112542e Convert DashboardRow to component 2024-12-24 16:48:48 -05:00
40bf99a2bf Use spread syntax for Form class 2024-12-24 15:09:58 -05:00
ebc669ecf8 Use new Router setup 2024-12-23 21:58:34 -05:00
b4664bdad7 Fix bad path for mount_to_body 2024-12-23 21:51:31 -05:00
608f18ace5 Manually import Params and use_params 2024-12-23 21:49:34 -05:00
20ff4674d4 Fix path to use_navigate 2024-12-23 21:49:34 -05:00
3de5efc27f Specify leptos::ev instead of ev 2024-12-23 21:49:34 -05:00
b9f5867b4d Fix Resource type signature 2024-12-23 21:49:34 -05:00
db8dc3cd3d Manually import TextProp 2024-12-23 21:49:33 -05:00
848b1afd2c Use node_ref instead of _ref 2024-12-23 21:33:07 -05:00
141a27bb7e Fix bad import path for use_location 2024-12-23 21:25:23 -05:00
78d59731b0 Manually import spawn_local 2024-12-23 21:24:26 -05:00
26a572b18a Fix bad import path for Form 2024-12-23 21:17:46 -05:00
f6ee5feb3f Update leptos-use to 0.15 2024-12-23 21:15:31 -05:00
0cd36d4b44 Remove duplicate "required" prop 2024-12-23 21:11:57 -05:00
3c148c36df Update server_fn to 0.7 2024-12-23 21:10:21 -05:00
4eb673a9a4 Fix bad import path for NodeRef 2024-12-23 21:06:28 -05:00
782c9b9482 Use correct use_params_map import path 2024-12-23 21:04:20 -05:00
52d60318bb Render String instead of &String from error_msg 2024-12-23 21:01:39 -05:00
7732b77eb5 Use leptos::either to handle mismatched return types instead of into_view() 2024-12-23 20:58:53 -05:00
fe131b1ba2 Use spread syntax for Icon class 2024-12-23 20:47:41 -05:00
064f06d763 Update icons 2024-12-23 20:47:28 -05:00
900d1ca1bb Use new way of creating resources 2024-12-23 20:34:28 -05:00
92eb63e946 Use new leptos::prelude module 2024-12-23 20:33:42 -05:00
a9c1ed7048 Upgrade to wasm-bindgen 0.2.99 2024-12-23 20:32:29 -05:00
a63b5d4e29 Remove hydrate flag from leptos_router 2024-12-23 20:31:31 -05:00
238a24c938 Remove nightly and hydrate flags from leptos_meta 2024-12-23 20:01:50 -05:00
69125f71f3 Update leptos_ crates to 0.7 2024-12-23 20:00:37 -05:00
ae8a3d0ade Merge pull request 'Remove old cicd utils' (#192) from 191-remove-old-cicd-utils into main
Some checks failed
Push Workflows / docker-build (push) Successful in 51s
Push Workflows / leptos-test (push) Successful in 57s
Push Workflows / docs (push) Successful in 1m0s
Push Workflows / build (push) Successful in 1m18s
Push Workflows / nix-build (push) Has been cancelled
Push Workflows / test (push) Has been cancelled
Reviewed-on: #192
2024-12-20 20:05:24 +00:00
343284a6da Remove cicd/
All checks were successful
Push Workflows / docs (push) Successful in 39s
Push Workflows / test (push) Successful in 46s
Push Workflows / leptos-test (push) Successful in 1m6s
Push Workflows / docker-build (push) Successful in 1m6s
Push Workflows / build (push) Successful in 1m10s
Push Workflows / nix-build (push) Successful in 29m5s
2024-12-20 15:00:12 -05:00
65e5de7051 Merge pull request 'Display like/dislike for client instead of viewed user on profile page' (#189) from 140-display-likedislike-for-client-instead-of into main
Some checks failed
Push Workflows / build (push) Has been cancelled
Push Workflows / leptos-test (push) Has been cancelled
Push Workflows / test (push) Has been cancelled
Push Workflows / nix-build (push) Has been cancelled
Push Workflows / docker-build (push) Has been cancelled
Push Workflows / docs (push) Has been cancelled
Reviewed-on: #189
2024-12-20 19:37:38 +00:00
219a218f92 Merge pull request 'Fix hardcoded for_user_id in artist page' (#190) from 185-fix-hardcoded-foruserid-in-artist-page into main
Some checks failed
Push Workflows / docker-build (push) Has been cancelled
Push Workflows / build (push) Has been cancelled
Push Workflows / docs (push) Has been cancelled
Push Workflows / test (push) Has been cancelled
Push Workflows / nix-build (push) Has been cancelled
Push Workflows / leptos-test (push) Has been cancelled
Reviewed-on: #190
2024-12-20 19:37:28 +00:00
f8534cd6f6 Return like/dislike data for user viewing page
All checks were successful
Push Workflows / leptos-test (push) Successful in 1m27s
Push Workflows / build (push) Successful in 1m41s
Push Workflows / test (push) Successful in 1m44s
Push Workflows / docs (push) Successful in 1m43s
Push Workflows / docker-build (push) Successful in 5m25s
Push Workflows / nix-build (push) Successful in 17m45s
2024-12-20 14:23:06 -05:00
01e393a77f Return like/dislike data for user viewing page
All checks were successful
Push Workflows / docs (push) Successful in 35s
Push Workflows / test (push) Successful in 45s
Push Workflows / leptos-test (push) Successful in 1m1s
Push Workflows / build (push) Successful in 2m52s
Push Workflows / docker-build (push) Successful in 5m52s
Push Workflows / nix-build (push) Successful in 19m17s
2024-12-20 14:21:22 -05:00
481d9109eb Merge pull request 'Fix docker-build caching' (#188) from 186-fix-dockerbuild-caching into main
All checks were successful
Push Workflows / nix-build (push) Successful in 20m7s
Push Workflows / docs (push) Successful in 39s
Push Workflows / test (push) Successful in 44s
Push Workflows / leptos-test (push) Successful in 1m3s
Push Workflows / build (push) Successful in 1m46s
Push Workflows / docker-build (push) Successful in 1m29s
Reviewed-on: #188
2024-12-20 19:09:41 +00:00
050cab6d46 Use GitHub Actions cache
All checks were successful
Push Workflows / test (push) Successful in 59s
Push Workflows / docs (push) Successful in 1m0s
Push Workflows / build (push) Successful in 1m17s
Push Workflows / leptos-test (push) Successful in 1m42s
Push Workflows / nix-build (push) Successful in 17m42s
Push Workflows / docker-build (push) Successful in 21m34s
2024-12-20 13:46:36 -05:00
87f5efed34 Merge pull request 'Create song page' (#187) from 144-create-song-page-2 into main
All checks were successful
Push Workflows / test (push) Successful in 46s
Push Workflows / leptos-test (push) Successful in 56s
Push Workflows / docs (push) Successful in 1m2s
Push Workflows / build (push) Successful in 1m27s
Push Workflows / nix-build (push) Successful in 16m22s
Push Workflows / docker-build (push) Successful in 15m40s
Reviewed-on: #187
2024-12-20 18:39:26 +00:00
525be5615c Add CSS for song page
All checks were successful
Push Workflows / test (push) Successful in 43s
Push Workflows / leptos-test (push) Successful in 1m0s
Push Workflows / docs (push) Successful in 2m7s
Push Workflows / build (push) Successful in 5m25s
Push Workflows / docker-build (push) Successful in 12m4s
Push Workflows / nix-build (push) Successful in 20m17s
2024-12-20 13:26:05 -05:00
28b71df7e6 Finish song page 2024-12-20 13:25:51 -05:00
560fe0355d Make some parts of SongList public 2024-12-20 13:25:39 -05:00
0e64131fa0 Add route for song page 2024-12-20 13:25:23 -05:00
f3f123d8f6 Add module for song page 2024-12-20 13:25:15 -05:00
15087e86b5 Add API endpoints for song page 2024-12-20 13:24:44 -05:00
c77699b3a1 Merge remote-tracking branch 'origin/main' into 144-create-song-page-2 2024-12-19 19:20:59 -05:00
e55f5d973e Merge pull request 'Create artist page' (#184) from 115-create-artist-page into main
All checks were successful
Push Workflows / docs (push) Successful in 40s
Push Workflows / test (push) Successful in 46s
Push Workflows / leptos-test (push) Successful in 50s
Push Workflows / docker-build (push) Successful in 53s
Push Workflows / build (push) Successful in 1m20s
Push Workflows / nix-build (push) Successful in 16m27s
Reviewed-on: #184
2024-12-20 00:12:42 +00:00
3586df650f Fix unused import warnings
Some checks failed
Push Workflows / docs (push) Successful in 47s
Push Workflows / leptos-test (push) Successful in 53s
Push Workflows / test (push) Successful in 54s
Push Workflows / build (push) Successful in 1m16s
Push Workflows / nix-build (push) Successful in 17m40s
Push Workflows / docker-build (push) Has been cancelled
2024-12-19 18:53:10 -05:00
579e764994 Finish artist page
Some checks failed
Push Workflows / test (push) Successful in 43s
Push Workflows / build (push) Failing after 50s
Push Workflows / docs (push) Successful in 1m26s
Push Workflows / nix-build (push) Has been cancelled
Push Workflows / docker-build (push) Has been cancelled
Push Workflows / leptos-test (push) Has been cancelled
2024-12-19 18:48:38 -05:00
fb86e2e229 Add API endpoints for artist page 2024-12-19 18:48:21 -05:00
0ff594aaec Increase artist image padding, remove margin 2024-12-19 18:48:05 -05:00
6cc5f60c5a Increase artist image size 2024-12-19 18:47:52 -05:00
ce9e16f376 Add route for artist page 2024-12-19 18:44:43 -05:00
e915e1ab44 Add module for artist page 2024-12-19 18:44:28 -05:00
7e7480d02b Add artist page CSS to main 2024-12-19 18:43:46 -05:00
dcdfee27a3 Merge remote-tracking branch 'origin/main' into 115-create-artist-page 2024-12-19 14:18:18 -05:00
8061bb9f5e Merge pull request 'Add caching to Rust CICD jobs' (#183) from 182-add-caching-to-rust-cicd-jobs into main
All checks were successful
Push Workflows / nix-build (push) Successful in 30m29s
Push Workflows / test (push) Successful in 36s
Push Workflows / docs (push) Successful in 36s
Push Workflows / leptos-test (push) Successful in 54s
Push Workflows / build (push) Successful in 6m41s
Push Workflows / docker-build (push) Successful in 11m38s
Reviewed-on: #183
2024-12-19 18:14:51 +00:00
cff1327b8a Use rust-cache action for build, test, docs, and leptos-test jobs
All checks were successful
Push Workflows / docker-build (push) Successful in 39s
Push Workflows / test (push) Successful in 1m5s
Push Workflows / docs (push) Successful in 1m7s
Push Workflows / leptos-test (push) Successful in 1m28s
Push Workflows / build (push) Successful in 1m43s
Push Workflows / nix-build (push) Successful in 16m54s
2024-12-19 11:19:39 -05:00
a5a679a74e Merge pull request 'Resolve "implement artist/album creation"' (#76) from 45-implement-artist-album-creation into main
Some checks failed
Push Workflows / test (push) Has been cancelled
Push Workflows / nix-build (push) Has been cancelled
Push Workflows / build (push) Has been cancelled
Push Workflows / docker-build (push) Successful in 30s
Push Workflows / docs (push) Successful in 58s
Push Workflows / leptos-test (push) Has been cancelled
Reviewed-on: #76
Reviewed-by: Ethan Girouard <ethan@girouard.com>
2024-12-19 05:33:45 +00:00
6042ec209c Switch to chrono instead of time
All checks were successful
Push Workflows / nix-build (push) Successful in 19m24s
Push Workflows / docker-build (push) Successful in 12m48s
Push Workflows / docs (push) Successful in 1m17s
Push Workflows / test (push) Successful in 2m13s
Push Workflows / leptos-test (push) Successful in 2m50s
Push Workflows / build (push) Successful in 4m55s
2024-12-19 00:13:52 -05:00
08a2322eb8 Merge remote-tracking branch 'origin/main' into 45-implement-artist-album-creation 2024-12-19 00:07:19 -05:00
36ffb33b02 Merge pull request 'Use different CICD image for docker-build job' (#179) from 178-use-different-cicd-image-for-dockerbuild into main
All checks were successful
Push Workflows / docs (push) Successful in 1m37s
Push Workflows / test (push) Successful in 2m33s
Push Workflows / leptos-test (push) Successful in 5m44s
Push Workflows / build (push) Successful in 6m1s
Push Workflows / docker-build (push) Successful in 12m53s
Push Workflows / nix-build (push) Successful in 19m30s
Reviewed-on: #179
2024-12-19 03:15:34 +00:00
42beaad659 Use ubuntu-latest-docker for docker-build job
All checks were successful
Push Workflows / docs (push) Successful in 1m55s
Push Workflows / test (push) Successful in 3m3s
Push Workflows / leptos-test (push) Successful in 3m50s
Push Workflows / build (push) Successful in 6m28s
Push Workflows / docker-build (push) Successful in 13m32s
Push Workflows / nix-build (push) Successful in 20m0s
2024-12-18 21:53:09 -05:00
bef240e2b2 Merge pull request 'Pin cargo-leptos version in Dockerfile' (#176) from 175-pin-cargoleptos-version-in-dockerfile into main
Some checks failed
Push Workflows / test (push) Waiting to run
Push Workflows / docker-build (push) Failing after 2m23s
Push Workflows / leptos-test (push) Successful in 2m52s
Push Workflows / nix-build (push) Has been cancelled
Push Workflows / docs (push) Has been cancelled
Push Workflows / build (push) Has been cancelled
Reviewed-on: #176
2024-12-18 06:52:37 +00:00
05074230a9 Pin cargo-leptos to 0.2.22 in Dockerfile
All checks were successful
Push Workflows / build (push) Successful in 14m41s
Push Workflows / test (push) Successful in 9m46s
Push Workflows / docs (push) Successful in 3m54s
Push Workflows / leptos-test (push) Successful in 10m45s
Push Workflows / docker-build (push) Successful in 27m57s
Push Workflows / nix-build (push) Successful in 25m33s
2024-12-18 00:36:19 -05:00
ecc5b35cd0 Merge pull request 'Add NixOS environment file' (#174) from 170-add-nixos-environment-file into main
Some checks failed
Push Workflows / docs (push) Successful in 1m30s
Push Workflows / docker-build (push) Failing after 1m30s
Push Workflows / test (push) Successful in 3m52s
Push Workflows / leptos-test (push) Successful in 6m31s
Push Workflows / build (push) Successful in 7m2s
Push Workflows / nix-build (push) Successful in 26m17s
Reviewed-on: #174
2024-12-18 05:35:17 +00:00
3536ad7343 Build flake from git URL
Some checks failed
Push Workflows / docs (push) Successful in 5m18s
Push Workflows / test (push) Successful in 9m29s
Push Workflows / leptos-test (push) Successful in 10m34s
Push Workflows / build (push) Successful in 11m24s
Push Workflows / docker-build (push) Failing after 18m17s
Push Workflows / nix-build (push) Successful in 29m38s
2024-12-17 23:44:56 -05:00
f1c94bd8a8 Update cargo-leptos to 0.2.22 in flake
Some checks failed
Push Workflows / docs (push) Successful in 6m41s
Push Workflows / test (push) Successful in 9m23s
Push Workflows / leptos-test (push) Successful in 11m18s
Push Workflows / build (push) Successful in 12m26s
Push Workflows / docker-build (push) Failing after 16m4s
Push Workflows / nix-build (push) Successful in 31m15s
2024-12-17 22:21:50 -05:00
59e97c4a79 Merge branch '172-fix-unexpected-cfg-condition-name-wasmbindgenunstabletestcoverage' into 170-add-nixos-environment-file
Some checks failed
Push Workflows / docs (push) Successful in 3m53s
Push Workflows / test (push) Successful in 6m31s
Push Workflows / leptos-test (push) Successful in 9m48s
Push Workflows / build (push) Successful in 10m15s
Push Workflows / docker-build (push) Failing after 17m18s
Push Workflows / nix-build (push) Failing after 26m27s
2024-12-17 21:46:47 -05:00
96e6b67c6e Add Nix build CICD job
Some checks failed
Push Workflows / docs (push) Successful in 2m39s
Push Workflows / build (push) Failing after 3m25s
Push Workflows / test (push) Successful in 5m22s
Push Workflows / leptos-test (push) Successful in 6m49s
Push Workflows / nix-build (push) Has been cancelled
Push Workflows / docker-build (push) Has been cancelled
2024-12-17 21:42:01 -05:00
5548992c57 Ignore some Nix files 2024-12-17 21:41:18 -05:00
414f507ef9 Merge pull request 'Add added_date column to songs table #100' (#118) from 100-add-addeddate-column-to-songs-table into main
Some checks failed
Push Workflows / docs (push) Successful in 2m9s
Push Workflows / test (push) Successful in 5m12s
Push Workflows / leptos-test (push) Successful in 6m59s
Push Workflows / build (push) Successful in 7m56s
Push Workflows / docker-build (push) Failing after 13m27s
Reviewed-on: #118
2024-12-15 23:57:20 +00:00
ec65d099f1 Merge branch '172-fix-unexpected-cfg-condition-name-wasmbindgenunstabletestcoverage' into 100-add-addeddate-column-to-songs-table
Some checks failed
Push Workflows / docs (push) Successful in 2m49s
Push Workflows / test (push) Successful in 7m19s
Push Workflows / leptos-test (push) Successful in 8m53s
Push Workflows / build (push) Successful in 9m36s
Push Workflows / docker-build (push) Failing after 15m33s
2024-12-15 18:07:59 -05:00
65aa296493 Merge pull request 'Fix unexpected cfg condition name wasm_bindgen_unstable_test_coverage warning' (#173) from 172-fix-unexpected-cfg-condition-name-wasmbindgenunstabletestcoverage into main
Some checks failed
Push Workflows / docs (push) Successful in 2m37s
Push Workflows / test (push) Successful in 5m54s
Push Workflows / leptos-test (push) Successful in 7m8s
Push Workflows / build (push) Successful in 8m34s
Push Workflows / docker-build (push) Failing after 14m6s
Reviewed-on: #173
2024-12-15 22:55:09 +00:00
d42ae8a227 Update wasm-bindgen to 0.2.96
Some checks failed
Push Workflows / docs (push) Successful in 3m47s
Push Workflows / test (push) Successful in 7m30s
Push Workflows / leptos-test (push) Successful in 8m42s
Push Workflows / build (push) Successful in 9m36s
Push Workflows / docker-build (push) Failing after 14m43s
2024-12-15 17:34:47 -05:00
b7b6406c2d Add added_date field to Song and SongData
Some checks failed
Push Workflows / docs (push) Successful in 2m8s
Push Workflows / build (push) Failing after 2m53s
Push Workflows / test (push) Successful in 5m7s
Push Workflows / leptos-test (push) Successful in 6m16s
Push Workflows / docker-build (push) Failing after 14m38s
2024-12-15 17:20:18 -05:00
9f39c9b3fd Merge branch 'main' into 100-add-addeddate-column-to-songs-table 2024-12-15 17:09:44 -05:00
8fbc733b6b Rename add_songs_added_date migration 2024-12-15 17:08:31 -05:00
3ec25881b9 Add Nix flake 2024-12-15 14:42:48 -05:00
5cb0f4a17b Add wasm32-unknown-unknown target to toolchain file 2024-12-15 14:42:32 -05:00
5967918642 Added css for songpage components
Some checks failed
Push Workflows / docs (push) Successful in 4m48s
Push Workflows / build (push) Failing after 5m49s
Push Workflows / test (push) Successful in 7m54s
Push Workflows / leptos-test (push) Successful in 11m48s
Push Workflows / docker-build (push) Failing after 32m19s
2024-12-11 04:38:08 +00:00
84371bb586 Added song overview component for the song's metadata 2024-12-11 04:37:51 +00:00
186821d838 Added songdetails component 2024-12-11 04:37:28 +00:00
4c46f78135 Added songpage component with basic structure and css file 2024-12-11 04:37:11 +00:00
9350c74091 Added mock api functions, need to implement later
Some checks failed
Push Workflows / docs (push) Successful in 3m8s
Push Workflows / build (push) Failing after 3m45s
Push Workflows / test (push) Successful in 4m57s
Push Workflows / leptos-test (push) Successful in 6m33s
Push Workflows / docker-build (push) Failing after 18m25s
2024-12-11 03:47:03 +00:00
88cd5544fd Added basic artist css 2024-12-11 03:46:07 +00:00
94880ead7c Added related artists component 2024-12-11 03:45:35 +00:00
837dd5ea3c Added top songs component that shows the top songs by an artist 2024-12-11 03:45:15 +00:00
86e5e733b3 Add artist detail component for name/bio/image 2024-12-11 03:44:43 +00:00
8dbaaf317d Added ArtistProfile component to get artist info based on id 2024-12-11 03:44:13 +00:00
d4897b4227 Added basic artist page
Some checks failed
Push Workflows / docs (push) Successful in 2m38s
Push Workflows / build (push) Failing after 3m32s
Push Workflows / docker-build (push) Has been cancelled
Push Workflows / test (push) Has been cancelled
Push Workflows / leptos-test (push) Has been cancelled
2024-12-11 03:42:32 +00:00
53805d8793 Add added_date column to songs table #100
All checks were successful
Push Workflows / docs (push) Successful in 2m46s
Push Workflows / test (push) Successful in 5m8s
Push Workflows / leptos-test (push) Successful in 6m49s
Push Workflows / build (push) Successful in 7m46s
Push Workflows / docker-build (push) Successful in 15m7s
2024-10-24 03:17:06 +00:00
6112e0dfac fixed input bug 2024-05-29 10:54:13 -04:00
Ethan Girouard af604a9ddc Fix doc comment indentation 2024-05-26 20:44:40 -04:00
bcb24c2a97 added background overlay when adding anything 2024-05-23 00:02:13 -04:00
6676f2c533 upload dropdown closes after selecting what upload 2024-05-22 23:17:52 -04:00
3746c370a2 completed add album component, front and backend 2024-05-22 23:04:21 -04:00
64e93649af Completed Adding Artist component, front and back 2024-05-22 20:24:30 -04:00
fcc5870824 create artist 2024-05-21 22:39:56 -04:00
3ce762ce5b create artist ui created 2024-05-21 22:39:48 -04:00
1ecd13d65f dropdown component complete 2024-05-21 12:41:44 -04:00
be775862f9 created dropdown component 2024-05-21 11:50:41 -04:00
159 changed files with 10626 additions and 7128 deletions


@@ -2,10 +2,12 @@
*
# Except:
!/assets
!/public
!/migrations
!/src
!/style
!/Cargo.lock
!/Cargo.toml
!/ascii_art.txt
!/docs
!/book.toml


@@ -7,13 +7,15 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Use Cache
uses: Swatinem/rust-cache@v2
- name: Build project
env:
RUSTFLAGS: "-D warnings"
run: cargo-leptos build
docker-build:
runs-on: ubuntu-latest
runs-on: ubuntu-latest-docker
steps:
- name: Checkout repository
uses: actions/checkout@v4
@@ -34,22 +36,24 @@ jobs:
with:
push: true
tags: "${{ steps.get-image-name.outputs.IMAGE_NAME }}:${{ gitea.sha }}"
cache-from: type=registry,ref=${{ steps.get-image-name.outputs.IMAGE_NAME }}:${{ gitea.sha }}
cache-to: type=inline
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Build and push Docker image with "latest" tag
uses: docker/build-push-action@v5
if: gitea.ref == 'refs/heads/main'
with:
push: true
tags: "${{ steps.get-image-name.outputs.IMAGE_NAME }}:latest"
cache-from: type=registry,ref=${{ steps.get-image-name.outputs.IMAGE_NAME }}:latest
cache-to: type=inline
cache-from: type=gha
cache-to: type=gha,mode=max
test:
runs-on: libretunes-cicd
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Use Cache
uses: Swatinem/rust-cache@v2
- name: Test project
run: cargo test --all-targets --all-features
@@ -58,6 +62,8 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Use Cache
uses: Swatinem/rust-cache@v2
- name: Run Leptos tests
run: cargo-leptos test
@@ -66,6 +72,8 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Use Cache
uses: Swatinem/rust-cache@v2
- name: Generate docs
run: cargo doc --no-deps
- name: Upload docs
@@ -73,3 +81,81 @@ jobs:
with:
name: docs
path: target/doc
nix-build:
runs-on: ubuntu-latest
steps:
- name: Update Package Lists
run: apt update
- name: Install Nix
run: apt install -y nix-bin
- name: Build project with Nix
run: nix build --experimental-features 'nix-command flakes' git+$GITHUB_SERVER_URL/$GITHUB_REPOSITORY.git?ref=$GITHUB_REF_NAME#default
clippy:
runs-on: libretunes-cicd
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Use Cache
uses: Swatinem/rust-cache@v2
- name: Run clippy
env:
RUSTFLAGS: "-D warnings"
run: cargo clippy --all-targets --all-features
rustfmt:
runs-on: libretunes-cicd
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Run rustfmt
run: cargo fmt --check
mdbook:
runs-on: libretunes-cicd
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Generate mdbook
run: mdbook build
- name: Upload mdbook
uses: actions/upload-artifact@v3
with:
name: mdbook
path: book
mdbook-server:
runs-on: ubuntu-latest-docker
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Login to Gitea container registry
uses: docker/login-action@v3
with:
registry: ${{ env.registry }}
username: ${{ env.actions_user }}
password: ${{ secrets.CONTAINER_REGISTRY_TOKEN }}
- name: Get Image Name
id: get-image-name
run: |
echo "IMAGE_NAME=$(echo ${{ env.registry }}/${{ gitea.repository }}-mdbook | tr '[:upper:]' '[:lower:]')" >> $GITHUB_OUTPUT
- name: Build and push Docker image
uses: docker/build-push-action@v5
with:
file: Dockerfile.mdbook
push: true
tags: "${{ steps.get-image-name.outputs.IMAGE_NAME }}:${{ gitea.sha }}"
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Build and push Docker image with "latest" tag
uses: docker/build-push-action@v5
if: gitea.ref == 'refs/heads/main'
with:
file: Dockerfile.mdbook
push: true
tags: "${{ steps.get-image-name.outputs.IMAGE_NAME }}:latest"
cache-from: type=gha
cache-to: type=gha,mode=max
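The `get-image-name` step above derives the registry image name by lowercasing the repository path, since container registries reject uppercase image names. A minimal sketch of that step, using hypothetical registry and repository values in place of the Gitea-provided ones:

```shell
# Hypothetical values standing in for env.registry and gitea.repository
registry="git.example.com"
repository="LibreTunes/LibreTunes"

# Same lowercasing the workflow's get-image-name step performs
IMAGE_NAME=$(echo "$registry/$repository-mdbook" | tr '[:upper:]' '[:lower:]')
echo "$IMAGE_NAME"
```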

.gitignore (vendored, 16 changes)

@@ -12,6 +12,9 @@ test-results/
end2end/playwright-report/
playwright/.cache/
# Default asset serve directory
assets/
# Audio files
*.mp3
*.wav
@@ -31,3 +34,16 @@ playwright/.cache/
# Sass cache
.sass-cache
# Nix-related files
.direnv/
result
# Old TailwindCSS config
style/tailwind.config.js
# mdbook output
book
# Diesel lockfile
migrations/.diesel_lock

Cargo.lock (generated, 2643 changes): file diff suppressed because it is too large


@@ -1,53 +1,76 @@
[package]
name = "libretunes"
version = "0.1.0"
edition = "2021"
edition = "2024"
build = "src/build.rs"
[profile.dev]
opt-level = 0
debug = 1
incremental = true
[profile.dev.package."*"]
opt-level = 3
debug = 2
[profile.dev.build-override]
opt-level = 3
[lib]
crate-type = ["cdylib", "rlib"]
[[bin]]
name = "health_check"
path = "src/health.rs"
required-features = ["health_check"]
[dependencies]
console_error_panic_hook = { version = "0.1", optional = true }
cfg-if = "1"
http = { version = "1.0", default-features = false }
leptos = { version = "0.6", default-features = false, features = ["nightly"] }
leptos_meta = { version = "0.6", features = ["nightly"] }
leptos_axum = { version = "0.6", optional = true }
leptos_router = { version = "0.6", features = ["nightly"] }
wasm-bindgen = { version = "=0.2.95", default-features = false, optional = true }
leptos_icons = { version = "0.3.0" }
icondata = { version = "0.3.0" }
leptos = { version = "0.8.10", default-features = false, features = ["nightly"] }
leptos_meta = { version = "0.8.5" }
leptos_axum = { version = "0.8.6", optional = true }
leptos_router = { version = "0.8.8", features = ["nightly"] }
wasm-bindgen = { version = "=0.2.100", default-features = false, optional = true }
leptos_icons = { version = "0.6.1" }
icondata = { version = "0.5.0" }
diesel = { version = "2.1.4", features = ["postgres", "r2d2", "chrono"], default-features = false, optional = true }
lazy_static = { version = "1.4.0", optional = true }
serde = { version = "1.0.195", features = ["derive"], default-features = false }
openssl = { version = "0.10.63", optional = true }
diesel_migrations = { version = "2.1.0", optional = true }
pbkdf2 = { version = "0.12.2", features = ["simple"], optional = true }
tokio = { version = "1", optional = true, features = ["rt-multi-thread"] }
axum = { version = "0.7.5", features = ["tokio", "http1"], default-features = false, optional = true }
axum = { version = "0.8.4", features = ["tokio", "http1"], default-features = false, optional = true }
tower = { version = "0.5.1", optional = true, features = ["util"] }
tower-http = { version = "0.6.1", optional = true, features = ["fs"] }
tower-http = { version = "0.6.1", optional = true, features = [
"fs", "compression-br", "compression-deflate", "compression-gzip", "compression-zstd"] }
thiserror = "1.0.57"
tower-sessions-redis-store = { version = "0.11", optional = true }
tower-sessions-redis-store = { version = "0.16", optional = true }
async-trait = { version = "0.1.79", optional = true }
axum-login = { version = "0.14.0", optional = true }
server_fn = { version = "0.6.11", features = ["multipart"] }
axum-login = { version = "0.17.0", optional = true }
server_fn = { version = "0.8.2", features = ["multipart"] }
symphonia = { version = "0.5.4", default-features = false, features = ["mp3"], optional = true }
multer = { version = "3.0.0", optional = true }
multer = { version = "3.1.0", optional = true }
log = { version = "0.4.21", optional = true }
flexi_logger = { version = "0.28.0", optional = true, default-features = false }
web-sys = "0.3.69"
leptos-use = "0.13.5"
leptos-use = "0.16.3"
image-convert = { version = "0.18.0", optional = true, default-features = false }
chrono = { version = "0.4.38", default-features = false, features = ["serde", "clock"] }
dotenvy = { version = "0.15.7", optional = true }
reqwest = { version = "0.12.9", default-features = false, optional = true }
futures = { version = "0.3.25", default-features = false, optional = true }
once_cell = { version = "1.20", default-features = false, optional = true }
libretunes_macro = { git = "https://git.libretunes.xyz/LibreTunes/LibreTunes-Macro.git", branch = "main" }
rand = { version = "0.9.1", optional = true }
clap = { version = "4.5.39", features = ["derive", "env"], optional = true }
tokio-tungstenite = { version = "0.26.2", optional = true }
audiotags = { version = "0.5.0", default-features = false, optional = true }
url = { version = "2.5.7", optional = true }
[features]
hydrate = [
"leptos/hydrate",
"leptos_meta/hydrate",
"leptos_router/hydrate",
"console_error_panic_hook",
"wasm-bindgen",
"chrono/wasmbind",
@@ -59,8 +82,6 @@ ssr = [
"leptos_router/ssr",
"dotenvy",
"diesel",
"lazy_static",
"openssl",
"diesel_migrations",
"pbkdf2",
"tokio",
@@ -76,6 +97,27 @@ ssr = [
"flexi_logger",
"leptos-use/ssr",
"image-convert",
"dep:audiotags",
"rand",
"dep:url",
"dep:clap",
]
reqwest_api = [
"reqwest",
"reqwest/cookies",
"futures",
"once_cell",
# Not needed, but fixes compile errors when building for all targets
# (which is useful for code editors checking for errors)
"server_fn/reqwest"
]
health_check = [
"reqwest_api",
"tokio",
"tokio/rt",
"tokio/macros",
"tokio-tungstenite",
]
# Defines a size-optimized profile for the WASM bundle in release mode
@@ -94,13 +136,20 @@ site-root = "target/site"
# The site-root relative folder where all compiled output (JS, WASM and CSS) is written
# Defaults to pkg
site-pkg-dir = "pkg"
# [Optional] The source CSS file. If it ends with .sass or .scss then it will be compiled by dart-sass into CSS. The CSS is optimized by Lightning CSS before being written to <site-root>/<site-pkg>/app.css
style-file = "style/main.scss"
# The tailwind input file.
#
# Optional, Activates the tailwind build
tailwind-input-file = "style/main.css"
# The tailwind config file.
#
# Optional, defaults to "tailwind.config.js" which if is not present
# is generated for you
tailwind-config-file = "style/tailwind.config.js"
# Assets source dir. All files found here will be copied and synchronized to site-root.
# The assets-dir cannot have a sub directory with the same name/path as site-pkg-dir.
#
# Optional. Env: LEPTOS_ASSETS_DIR.
assets-dir = "assets"
assets-dir = "public"
# The IP and port (ex: 127.0.0.1:3000) where the server serves the content. Use it in your server setup.
site-addr = "127.0.0.1:3000"
# The port to use for automatic reload monitoring
@@ -116,6 +165,8 @@ browserquery = "defaults"
watch = false
# The environment Leptos will run in, usually either "DEV" or "PROD"
env = "DEV"
# Specify the name of the bin target
bin-target = "libretunes"
# The features to use when compiling the bin target
#
# Optional. Can be over-ridden with the command line parameter --bin-features
@@ -140,3 +191,8 @@ lib-default-features = false
#
# Optional. Defaults to "release".
lib-profile-release = "wasm-release"
# Enables additional file hashes on outputted css, js, and wasm files
#
# Optional: Defaults to false. Can also be set with the LEPTOS_HASH_FILES=false env var (must be set at runtime too)
hash-files = true


@@ -1,10 +1,11 @@
FROM rust:slim AS builder
ENV LEPTOS_TAILWIND_VERSION=v4.0.6
WORKDIR /app
RUN rustup default nightly
RUN rustup target add wasm32-unknown-unknown
RUN cargo install cargo-leptos
# Install a few dependencies
RUN set -eux; \
@@ -18,6 +19,9 @@ RUN set -eux; \
wget; \
rm -rf /var/lib/apt/lists/*
RUN wget -O - https://github.com/leptos-rs/cargo-leptos/releases/download/v0.2.42/cargo-leptos-x86_64-unknown-linux-gnu.tar.gz \
| tar xvfz - -C /bin --strip-components=1 cargo-leptos-x86_64-unknown-linux-gnu/cargo-leptos
# Install ImageMagick
RUN cd / && \
wget https://github.com/ImageMagick/ImageMagick/archive/refs/tags/7.1.1-38.tar.gz && \
@@ -33,27 +37,29 @@ RUN cd / && \
COPY Cargo.toml Cargo.lock /app/
# Create dummy files to force cargo to build the dependencies
RUN mkdir /app/src && mkdir /app/style && mkdir /app/assets && \
echo "fn main() {}" | tee /app/src/build.rs > /app/src/main.rs && \
RUN mkdir /app/src && mkdir /app/style && mkdir /app/public && \
echo "fn main() {}" | tee /app/src/build.rs | tee /app/src/main.rs > /app/src/health.rs && \
touch /app/src/lib.rs && \
touch /app/style/main.scss
touch /app/style/main.css
# Prebuild dependencies
RUN cargo-leptos build --release --precompress
RUN cargo build --bin health_check --features health_check --release
RUN rm -rf /app/src /app/style /app/assets
RUN rm -rf /app/src /app/style /app/public
COPY ascii_art.txt /app/ascii_art.txt
COPY assets /app/assets
COPY public /app/public
COPY src /app/src
COPY migrations /app/migrations
COPY style /app/style
# Touch files to force rebuild
RUN touch /app/src/main.rs && touch /app/src/lib.rs && touch /app/src/build.rs
RUN touch /app/src/main.rs && touch /app/src/lib.rs && touch /app/src/build.rs && touch /app/src/health.rs
# Actually build the binary
RUN cargo-leptos build --release --precompress
RUN cargo build --bin health_check --features health_check --release
# Use ldd to list all dependencies of /app/target/release/libretunes, then copy them to /app/libs
# Setting LD_LIBRARY_PATH is necessary to find the ImageMagick libraries
@@ -70,6 +76,9 @@ library manager built for collaborative listening."
# Copy the binary and the compressed assets to the "site root"
COPY --from=builder /app/target/release/libretunes /libretunes
COPY --from=builder /app/target/site /site
COPY --from=builder /app/target/release/health_check /health_check
HEALTHCHECK CMD [ "/health_check" ]
# Copy libraries to /lib64
COPY --from=builder /app/libs /lib64

Dockerfile.mdbook (new file, 13 changes)

@@ -0,0 +1,13 @@
FROM rust:slim AS builder
WORKDIR /app
RUN cargo install mdbook
COPY book.toml /app/book.toml
COPY docs /app/docs
RUN mdbook build
FROM nginx:alpine AS webserver
COPY --from=builder /app/book /usr/share/nginx/html

book.toml (new file, 4 changes)

@@ -0,0 +1,4 @@
[book]
language = "en"
src = "docs"
title = "LibreTunes Documentation"


@@ -1,22 +0,0 @@
#!/bin/sh
set -e
ZONE_ID=$1
RECORD_NAME=$2
RECORD_COMMENT=$3
API_TOKEN=$4
TUNNEL_ID=$5
curl --request POST --silent \
--url https://api.cloudflare.com/client/v4/zones/$ZONE_ID/dns_records \
--header 'Content-Type: application/json' \
--header "Authorization: Bearer $API_TOKEN" \
--data '{
"content": "'$TUNNEL_ID'.cfargotunnel.com",
"name": "'$RECORD_NAME'",
"comment": "'$RECORD_COMMENT'",
"proxied": true,
"type": "CNAME",
"ttl": 1
}' \
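The removed script above splices its arguments directly into the JSON request body by alternating single-quoted literals with unquoted variable expansions. A sketch of that quoting pattern with hypothetical values (the real ones arrived as positional arguments); note the values are interpolated verbatim, so they must not themselves contain double quotes:

```shell
# Hypothetical stand-ins for the script's positional arguments
TUNNEL_ID="abc123"
RECORD_NAME="music.example.com"
RECORD_COMMENT="LibreTunes tunnel"

# Same quote-splicing the script used to build the --data payload
PAYLOAD='{
  "content": "'$TUNNEL_ID'.cfargotunnel.com",
  "name": "'$RECORD_NAME'",
  "comment": "'$RECORD_COMMENT'",
  "proxied": true,
  "type": "CNAME",
  "ttl": 1
}'
printf '%s\n' "$PAYLOAD"
```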


@@ -1,19 +0,0 @@
#!/bin/sh
set -e
SERVICE=$1
HOSTNAME=$2
TUNNEL_ID=$3
echo "Creating tunnel config for $HOSTNAME"
cat <<EOF > cloudflared-tunnel-config.yml
tunnel: $TUNNEL_ID
credentials-file: /etc/cloudflared/auth.json
ingress:
- hostname: $HOSTNAME
service: $SERVICE
- service: http_status:404
EOF
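The tunnel-config script above expands its arguments through a heredoc to emit the cloudflared ingress file. The same expansion can be sketched with hypothetical values, capturing the output in a variable rather than writing `cloudflared-tunnel-config.yml`:

```shell
# Hypothetical stand-ins for the script's positional arguments
SERVICE="http://libretunes:3000"
HOSTNAME="music.example.com"
TUNNEL_ID="00000000-0000-0000-0000-000000000000"

# Same heredoc expansion the script performed, captured instead of written to a file
CONFIG=$(cat <<EOF
tunnel: $TUNNEL_ID
credentials-file: /etc/cloudflared/auth.json
ingress:
  - hostname: $HOSTNAME
    service: $SERVICE
  - service: http_status:404
EOF
)
printf '%s\n' "$CONFIG"
```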


@@ -1,55 +0,0 @@
version: '3'
services:
cloudflare:
image: cloudflare/cloudflared:latest
command: tunnel run
volumes:
- cloudflared-config:/etc/cloudflared:ro
libretunes:
image: registry.mregirouard.com/libretunes/libretunes:${LIBRETUNES_VERSION}
environment:
REDIS_URL: redis://redis:6379
POSTGRES_HOST: postgres
POSTGRES_USER: libretunes
POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
POSTGRES_DB: libretunes
volumes:
- libretunes-audio:/site/audio
depends_on:
- redis
- postgres
restart: always
redis:
image: redis:latest
volumes:
- libretunes-redis:/data
restart: always
healthcheck:
test: ["CMD-SHELL", "redis-cli", "ping"]
interval: 10s
timeout: 5s
retries: 5
postgres:
image: postgres:latest
environment:
POSTGRES_USER: libretunes
POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
POSTGRES_DB: libretunes
volumes:
- libretunes-postgres:/var/lib/postgresql/data
restart: always
healthcheck:
test: ["CMD-SHELL", "pg_isready -U libretunes"]
interval: 10s
timeout: 5s
retries: 5
volumes:
cloudflared-config:
libretunes-audio:
libretunes-redis:
libretunes-postgres:


@@ -1,22 +0,0 @@
#!/bin/sh
set -e
ZONE_ID=$1
RECORD_NAME=$2
RECORD_COMMENT=$3
API_TOKEN=$4
RECORD_ID=$(
curl --request GET --silent \
--url "https://api.cloudflare.com/client/v4/zones/$ZONE_ID/dns_records?name=$RECORD_NAME&comment=$RECORD_COMMENT" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer $API_TOKEN" \
| jq -r '.result[0].id')
echo "Deleting DNS record ID $RECORD_ID"
curl --request DELETE --silent \
--url "https://api.cloudflare.com/client/v4/zones/$ZONE_ID/dns_records/$RECORD_ID" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer $API_TOKEN"

docs/SUMMARY.md (new file, 1 change)

@@ -0,0 +1 @@
# Summary

flake.lock (generated, new file, 96 changes)

@@ -0,0 +1,96 @@
{
"nodes": {
"flake-utils": {
"inputs": {
"systems": "systems"
},
"locked": {
"lastModified": 1731533236,
"narHash": "sha256-l0KFg5HjrsfsO/JpG+r7fRrqm12kzFHyUHqHCVpMMbI=",
"owner": "numtide",
"repo": "flake-utils",
"rev": "11707dc2f618dd54ca8739b309ec4fc024de578b",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "flake-utils",
"type": "github"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1760284886,
"narHash": "sha256-TK9Kr0BYBQ/1P5kAsnNQhmWWKgmZXwUQr4ZMjCzWf2c=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "cf3f5c4def3c7b5f1fc012b3d839575dbe552d43",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "nixos-unstable",
"repo": "nixpkgs",
"type": "github"
}
},
"nixpkgs_2": {
"locked": {
"lastModified": 1744536153,
"narHash": "sha256-awS2zRgF4uTwrOKwwiJcByDzDOdo3Q1rPZbiHQg/N38=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "18dd725c29603f582cf1900e0d25f9f1063dbf11",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "nixpkgs-unstable",
"repo": "nixpkgs",
"type": "github"
}
},
"root": {
"inputs": {
"flake-utils": "flake-utils",
"nixpkgs": "nixpkgs",
"rust-overlay": "rust-overlay"
}
},
"rust-overlay": {
"inputs": {
"nixpkgs": "nixpkgs_2"
},
"locked": {
"lastModified": 1760409263,
"narHash": "sha256-GvcdHmY3nZnU6GnUkEG1a7pDZPgFcuN+zGv3OgvfPH0=",
"owner": "oxalica",
"repo": "rust-overlay",
"rev": "5694018463c2134e2369996b38deed41b1b9afc1",
"type": "github"
},
"original": {
"owner": "oxalica",
"repo": "rust-overlay",
"type": "github"
}
},
"systems": {
"locked": {
"lastModified": 1681028828,
"narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=",
"owner": "nix-systems",
"repo": "default",
"rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e",
"type": "github"
},
"original": {
"owner": "nix-systems",
"repo": "default",
"type": "github"
}
}
},
"root": "root",
"version": 7
}

flake.nix (new file)

@@ -0,0 +1,89 @@
{
description = "LibreTunes build and development environment";
inputs = {
nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
rust-overlay.url = "github:oxalica/rust-overlay";
flake-utils.url = "github:numtide/flake-utils";
};
outputs = { self, nixpkgs, rust-overlay, flake-utils, ... }:
flake-utils.lib.eachDefaultSystem (system:
let
overlays = [ (import rust-overlay) ];
pkgs = import nixpkgs {
inherit system overlays;
};
buildPkgs = with pkgs; [
(rust-bin.fromRustupToolchainFile ./rust-toolchain.toml)
cargo-leptos
clang
openssl
postgresql
imagemagick
pkg-config
tailwindcss_4
binaryen
];
in
{
devShells.default = pkgs.mkShell {
LIBCLANG_PATH = pkgs.lib.makeLibraryPath [ pkgs.llvmPackages_latest.libclang.lib ];
LD_LIBRARY_PATH = pkgs.lib.makeLibraryPath [ pkgs.libgcc.lib ];
buildInputs = with pkgs; buildPkgs ++ [
diesel-cli
mdbook
];
shellHook = ''
set -a
[[ -f .env ]] && source .env
set +a
'';
};
packages.default = pkgs.rustPlatform.buildRustPackage {
name = "libretunes";
src = ./.;
cargoLock = {
lockFile = ./Cargo.lock;
# Needed because of git dependency
outputHashes = {
"libretunes_macro-0.1.0" = "sha256-e5hcT3fCvhyaCMPHslTkLqFE03id5Vg7/SekyVn1JI4=";
};
};
LIBCLANG_PATH = pkgs.lib.makeLibraryPath [ pkgs.llvmPackages_latest.libclang.lib ];
nativeBuildInputs = with pkgs; buildPkgs ++ [
makeWrapper
];
buildInputs = with pkgs; [
openssl
imagemagick
];
buildPhase = ''
cargo-leptos build --precompress --release
'';
installPhase = ''
mkdir -p $out/bin
install -t $out target/release/libretunes
cp -r target/site $out/site
makeWrapper $out/libretunes $out/bin/libretunes \
--set LEPTOS_SITE_ROOT $out/site \
--set LD_LIBRARY_PATH ${pkgs.libgcc.lib}
'';
doCheck = false;
};
}
);
}


@@ -0,0 +1,2 @@
ALTER TABLE songs
DROP COLUMN added_date;


@@ -0,0 +1,2 @@
ALTER TABLE songs
ADD COLUMN added_date DATE DEFAULT CURRENT_DATE;


@@ -0,0 +1,4 @@
ALTER TABLE songs
ALTER COLUMN added_date TYPE DATE USING added_date::DATE,
ALTER COLUMN added_date SET DEFAULT CURRENT_DATE,
ALTER COLUMN added_date SET NOT NULL;


@@ -0,0 +1,4 @@
ALTER TABLE songs
ALTER COLUMN added_date TYPE TIMESTAMP USING added_date::TIMESTAMP,
ALTER COLUMN added_date SET DEFAULT CURRENT_TIMESTAMP,
ALTER COLUMN added_date SET NOT NULL;


@@ -0,0 +1,4 @@
DROP TRIGGER IF EXISTS playlist_songs_after_insert
ON playlist_songs;
DROP FUNCTION IF EXISTS trg_update_playlists_updated_at();


@@ -0,0 +1,15 @@
CREATE OR REPLACE FUNCTION trg_update_playlists_updated_at()
RETURNS TRIGGER AS $$
BEGIN
UPDATE playlists
SET updated_at = NOW()
WHERE id = NEW.playlist_id;
RETURN NEW;
END;
$$
LANGUAGE plpgsql;
CREATE TRIGGER playlist_songs_after_insert
AFTER INSERT ON playlist_songs
FOR EACH ROW
EXECUTE PROCEDURE trg_update_playlists_updated_at();


@@ -0,0 +1 @@
ALTER TABLE artists DROP COLUMN image_path;


@@ -0,0 +1 @@
ALTER TABLE artists ADD COLUMN image_path VARCHAR;


@@ -0,0 +1 @@
ALTER TABLE users DROP COLUMN image_path;


@@ -0,0 +1 @@
ALTER TABLE users ADD COLUMN image_path VARCHAR;


@@ -0,0 +1 @@
ALTER TABLE playlists DROP COLUMN image_path;


@@ -0,0 +1 @@
ALTER TABLE playlists ADD COLUMN image_path VARCHAR;

(binary image file: 121 KiB before and after)

(binary image file: 1.4 KiB before and after)


@@ -1,3 +1,4 @@
[toolchain]
channel = "nightly"
targets = ["wasm32-unknown-unknown"]


@@ -1,42 +0,0 @@
use crate::models::Artist;
use crate::components::dashboard_tile::DashboardTile;
use serde::{Serialize, Deserialize};
use chrono::NaiveDate;
/// Holds information about an album
///
/// Intended to be used in the front-end
#[derive(Serialize, Deserialize, Clone)]
pub struct AlbumData {
/// Album id
pub id: i32,
/// Album title
pub title: String,
/// Album artists
pub artists: Vec<Artist>,
/// Album release date
pub release_date: Option<NaiveDate>,
/// Path to album image, relative to the root of the web server.
/// For example, `"/assets/images/Album.jpg"`
pub image_path: String,
}
impl DashboardTile for AlbumData {
fn image_path(&self) -> String {
self.image_path.clone()
}
fn title(&self) -> String {
self.title.clone()
}
fn link(&self) -> String {
format!("/album/{}", self.id)
}
fn description(&self) -> Option<String> {
Some(format!("Album • {}", Artist::display_list(&self.artists)))
}
}


@@ -1,33 +1,52 @@
use leptos::*;
use crate::albumdata::AlbumData;
use crate::songdata::SongData;
use crate::prelude::*;
use cfg_if::cfg_if;
#[api_fn(endpoint = "album/get")]
pub async fn get_album(
id: i32,
db_conn: &mut PgPooledConn,
) -> BackendResult<Option<frontend::Album>> {
let album_artists: Vec<(backend::Album, Option<backend::Artist>)> = albums::table
.find(id)
.left_join(
album_artists::table
.inner_join(artists::table.on(artists::id.eq(album_artists::artist_id)))
.on(albums::id.eq(album_artists::album_id)),
)
.select((albums::all_columns, artists::all_columns.nullable()))
.load(db_conn)
.context("Error loading album from database")?;
cfg_if! {
if #[cfg(feature = "ssr")] {
use leptos::server_fn::error::NoCustomError;
use crate::database::get_db_conn;
}
let mut album_artists = album_artists.into_iter();
let Some((album, first_artist)) = album_artists.next() else {
return Ok(None);
};
let mut artists = Vec::with_capacity(album_artists.len());
if let Some(artist) = first_artist {
artists.push(artist);
}
artists.extend(album_artists.filter_map(|(_, artist)| artist));
let album = frontend::Album {
id: album.id,
title: album.title,
artists,
release_date: album.release_date,
image_path: LocalPath::to_web_path_or_placeholder(album.image_path),
};
Ok(Some(album))
}
#[server(endpoint = "album/get")]
pub async fn get_album(id: i32) -> Result<AlbumData, ServerFnError> {
use crate::models::Album;
let db_con = &mut get_db_conn();
let album = Album::get_album_data(id,db_con)
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error getting album: {}", e)))?;
Ok(album)
#[api_fn(endpoint = "album/get_songs")]
pub async fn get_songs(id: i32, db_conn: &mut PgPooledConn) -> BackendResult<Vec<i32>> {
songs::table
.filter(songs::album_id.eq(id))
.select(songs::id)
.order(songs::track.asc())
.load(db_conn)
.context("Error loading album songs from database")
}
#[server(endpoint = "album/get_songs")]
pub async fn get_songs(id: i32) -> Result<Vec<SongData>, ServerFnError> {
use crate::models::Album;
use crate::auth::get_logged_in_user;
let user = get_logged_in_user().await?;
let db_con = &mut get_db_conn();
// TODO: NEEDS SONG DATA QUERIES
let songdata = Album::get_song_data(id,user,db_con)
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error getting song data: {}", e)))?;
Ok(songdata)
}

src/api/albums.rs (new file)

@@ -0,0 +1,49 @@
use crate::prelude::*;
/// Add an album to the database
///
/// # Arguments
///
* `album_title` - The title of the album to add
* `release_date` - The release date of the album (Optional)
/// * `image_path` - The path to the album's image file (Optional)
///
/// # Returns
* `BackendResult<()>` - An empty result if successful, or an error
///
#[api_fn(endpoint = "albums/add-album")]
pub async fn add_album(
album_title: String,
release_date: Option<String>,
image_path: Option<String>,
db_conn: &mut PgPooledConn,
) -> BackendResult<()> {
let parsed_release_date = match release_date {
Some(date) => match NaiveDate::parse_from_str(date.trim(), "%Y-%m-%d") {
Ok(parsed_date) => Some(parsed_date),
Err(e) => {
return Err(
InputError::InvalidInput(format!("Error parsing release date: {e}")).into(),
);
}
},
None => None,
};
let image_path_arg = image_path
.filter(|image_path| !image_path.is_empty())
.map(LocalPath::new);
let new_album = backend::NewAlbum {
title: album_title,
release_date: parsed_release_date,
image_path: image_path_arg,
};
diesel::insert_into(albums::table)
.values(&new_album)
.execute(db_conn)
.context("Error inserting new album into database")?;
Ok(())
}
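`add_album` trims the incoming string and parses it with `NaiveDate::parse_from_str(_, "%Y-%m-%d")`, rejecting anything else as invalid input. A shell sketch of the same trim-then-validate step; the helper name is hypothetical, and it checks only the `YYYY-MM-DD` shape, not calendar validity:

```shell
#!/bin/sh
# Hypothetical stand-in for the release_date handling in add_album:
# trim surrounding whitespace, then require the YYYY-MM-DD shape.
validate_release_date() {
  trimmed=$(printf '%s' "$1" | sed 's/^[[:space:]]*//; s/[[:space:]]*$//')
  printf '%s\n' "$trimmed" | grep -Eq '^[0-9]{4}-[0-9]{2}-[0-9]{2}$' || return 1
  printf '%s\n' "$trimmed"
}

validate_release_date ' 2024-06-01 '    # prints 2024-06-01
validate_release_date 'June 1, 2024' || echo 'rejected'
```

Unlike this regex check, the Rust side also rejects impossible dates such as `2024-13-01`, since `NaiveDate` validates the calendar.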

src/api/artists.rs (new file)

@@ -0,0 +1,130 @@
use crate::prelude::*;
/// Add an artist to the database
///
/// # Arguments
///
/// * `artist_name` - The name of the artist to add
///
/// # Returns
* `BackendResult<()>` - An empty result if successful, or an error
///
#[api_fn(endpoint = "artists/add-artist")]
pub async fn add_artist(artist_name: String, db_conn: &mut PgPooledConn) -> BackendResult<()> {
let new_artist = backend::NewArtist {
name: artist_name,
image_path: None,
};
diesel::insert_into(artists::table)
.values(&new_artist)
.execute(db_conn)
.context("Error inserting new artist into database")?;
Ok(())
}
#[api_fn(endpoint = "artists/get")]
pub async fn get_artist_by_id(
artist_id: i32,
db_conn: &mut PgPooledConn,
) -> BackendResult<Option<frontend::Artist>> {
let artist = artists::table
.filter(artists::id.eq(artist_id))
.first::<backend::Artist>(db_conn)
.optional()
.context("Error loading artist from database")?;
let Some(artist) = artist else {
return Ok(None);
};
let artist = frontend::Artist {
id: artist.id,
name: artist.name,
image_path: LocalPath::to_web_path_or_placeholder(artist.image_path),
};
Ok(Some(artist))
}
#[api_fn(endpoint = "artists/top_songs")]
pub async fn top_songs_by_artist(
artist_id: i32,
limit: Option<i64>,
db_conn: &mut PgPooledConn,
) -> BackendResult<Vec<(i32, i64)>> {
if let Some(limit) = limit {
song_history::table
.group_by(song_history::song_id)
.select((song_history::song_id, diesel::dsl::count(song_history::id)))
.left_join(song_artists::table.on(song_artists::song_id.eq(song_history::song_id)))
.filter(song_artists::artist_id.eq(artist_id))
.order_by(diesel::dsl::count(song_history::id).desc())
.left_join(songs::table.on(songs::id.eq(song_history::song_id)))
.limit(limit)
.load(db_conn)
} else {
song_history::table
.group_by(song_history::song_id)
.select((song_history::song_id, diesel::dsl::count(song_history::id)))
.left_join(song_artists::table.on(song_artists::song_id.eq(song_history::song_id)))
.filter(song_artists::artist_id.eq(artist_id))
.order_by(diesel::dsl::count(song_history::id).desc())
.left_join(songs::table.on(songs::id.eq(song_history::song_id)))
.load(db_conn)
}
.context("Failed to get artist top songs")
}
#[api_fn(endpoint = "artists/albums")]
pub async fn albums_by_artist(
artist_id: i32,
limit: Option<i64>,
db_conn: &mut PgPooledConn,
) -> BackendResult<Vec<frontend::Album>> {
let album_ids = albums::table
.left_join(album_artists::table)
.filter(album_artists::artist_id.eq(artist_id))
.order_by(albums::release_date.desc())
.select(albums::id);
let album_ids = if let Some(limit) = limit {
album_ids.limit(limit).into_boxed()
} else {
album_ids.into_boxed()
};
let mut albums_map: HashMap<i32, frontend::Album> = HashMap::new();
let album_artists: Vec<(backend::Album, backend::Artist)> = albums::table
.filter(albums::id.eq_any(album_ids))
.inner_join(
album_artists::table
.inner_join(artists::table)
.on(albums::id.eq(album_artists::album_id)),
)
.select((albums::all_columns, artists::all_columns))
.load(db_conn)
.context("Error loading album artists from database")?;
for (album, artist) in album_artists {
if let Some(stored_album) = albums_map.get_mut(&album.id) {
stored_album.artists.push(artist);
} else {
let albumdata = frontend::Album {
id: album.id,
title: album.title,
artists: vec![artist],
release_date: album.release_date,
image_path: LocalPath::to_web_path_or_placeholder(album.image_path),
};
albums_map.insert(album.id, albumdata);
}
}
let mut albums: Vec<frontend::Album> = albums_map.into_values().collect();
albums.sort_by(|a1, a2| a2.release_date.cmp(&a1.release_date));
Ok(albums)
}

src/api/auth.rs (new file)

@@ -0,0 +1,234 @@
use crate::prelude::*;
cfg_if! {
if #[cfg(feature = "ssr")] {
use leptos_axum::extract;
use axum_login::AuthSession;
}
}
/// Create a new user and log them in
/// Takes in a `NewUser` struct, with the password in plaintext
/// Returns a Result with the error message if the user could not be created
#[api_fn(endpoint = "signup")]
pub async fn signup(new_user: backend::NewUser) -> BackendResult<()> {
// Check LIBRETUNES_DISABLE_SIGNUP env var
if std::env::var("LIBRETUNES_DISABLE_SIGNUP").is_ok_and(|v| v == "true") {
return Err(AuthError::SignupDisabled.into());
}
// Ensure the user has no id, and is not a self-proclaimed admin
let new_user = backend::NewUser {
admin: false,
..new_user
};
api::users::create_user(&new_user)
.await
.context("Error creating user")?;
let mut auth_session = extract::<AuthSession<AuthBackend>>()
.await
.context("Error extracting auth session")?;
let credentials = UserCredentials {
username_or_email: new_user.username.clone(),
password: new_user.password.clone().unwrap(),
};
match auth_session.authenticate(credentials).await {
Ok(Some(user)) => auth_session
.login(&user)
.await
.map_err(|e| AuthError::AuthError(format!("Error logging in user: {e}")).into()),
Ok(None) => Err(AuthError::InvalidCredentials.into()),
Err(e) => Err(AuthError::AuthError(format!("Error authenticating user: {e}")).into()),
}
}
/// Log a user in
/// Takes in a username or email and a password in plaintext
/// Returns a Result with a boolean indicating if the login was successful
#[api_fn(endpoint = "login")]
pub async fn login(
credentials: UserCredentials,
db_conn: &mut PgPooledConn,
) -> BackendResult<Option<frontend::User>> {
let mut auth_session = extract::<AuthSession<AuthBackend>>()
.await
.context("Error extracting auth session")?;
let user = api::users::validate_user(credentials, db_conn)
.await
.context("Error validating user credentials")?;
if let Some(user) = user {
auth_session
.login(&user)
.await
.map_err(|e| AuthError::AuthError(format!("Error logging in user: {e}")))?;
Ok(Some(user.into()))
} else {
Ok(None)
}
}
/// Log a user out
/// Returns a Result with the error message if the user could not be logged out
#[api_fn(endpoint = "logout")]
pub async fn logout() -> BackendResult<()> {
let mut auth_session = extract::<AuthSession<AuthBackend>>()
.await
.context("Error extracting auth session")?;
auth_session
.logout()
.await
.map_err(|e| AuthError::AuthError(format!("Error logging out user: {e}")))?;
leptos_axum::redirect("/login");
Ok(())
}
/// Check if a user is logged in
/// Returns a Result with a boolean indicating if the user is logged in
#[api_fn(endpoint = "check_auth")]
pub async fn check_auth(user: Option<backend::User>) -> BackendResult<bool> {
Ok(user.is_some())
}
/// Require that a user is logged in
/// Returns a Result with the error message if the user is not logged in
/// Intended to be used at the start of a protected route, to ensure the user is logged in
#[cfg(feature = "ssr")]
pub async fn require_auth() -> BackendResult<()> {
check_auth()
.await
.context("Error checking authentication")
.and_then(|logged_in| {
if logged_in {
Ok(())
} else {
Err(AuthError::Unauthorized.into())
}
})
}
#[cfg(feature = "ssr")]
pub async fn get_backend_user() -> BackendResult<backend::User> {
<backend::User as LoggedInUser>::get().await
}
/// Get the current logged-in user
/// Returns a Result with the user if they are logged in
/// Returns an error if the user is not logged in, or if there is an error getting the user
/// Intended to be used in a route to get the current user
#[cfg(feature = "ssr")]
pub async fn get_user() -> BackendResult<frontend::User> {
<frontend::User as LoggedInUser>::get().await
}
#[api_fn(endpoint = "get_logged_in_user")]
pub async fn get_logged_in_user(
user: Option<frontend::User>,
) -> BackendResult<Option<frontend::User>> {
Ok(user)
}
/// Check if a user is an admin
/// Returns a Result with a boolean indicating if the user is logged in and an admin
#[api_fn(endpoint = "check_admin")]
pub async fn check_admin(user: Option<backend::User>) -> BackendResult<bool> {
Ok(user.map(|user| user.admin).unwrap_or(false))
}
/// Require that a user is logged in and an admin
/// Returns a Result with the error message if the user is not logged in or is not an admin
/// Intended to be used at the start of a protected route, to ensure the user is logged in and an admin
#[cfg(feature = "ssr")]
pub async fn require_admin() -> BackendResult<()> {
check_admin().await.and_then(|is_admin| {
if is_admin {
Ok(())
} else {
Err(AuthError::AdminRequired.into())
}
})
}
cfg_if! {
if #[cfg(feature = "ssr")] {
pub trait LoggedInUser: Sized {
fn get() -> impl std::future::Future<Output = BackendResult<Self>> + Send;
fn is_logged_in(&self) -> bool;
fn is_admin(&self) -> bool;
}
impl LoggedInUser for Option<backend::User> {
async fn get() -> BackendResult<Option<backend::User>> {
extract::<AuthSession<AuthBackend>>()
.await
.context("Error extracting auth session")
.map(|auth| auth.user)
}
fn is_logged_in(&self) -> bool {
self.is_some()
}
fn is_admin(&self) -> bool {
self.as_ref().is_some_and(|user| user.admin)
}
}
impl LoggedInUser for backend::User {
async fn get() -> BackendResult<backend::User> {
<Option<backend::User> as LoggedInUser>::get()
.await?
.ok_or(AuthError::Unauthorized.into())
}
fn is_logged_in(&self) -> bool {
true
}
fn is_admin(&self) -> bool {
self.admin
}
}
impl LoggedInUser for Option<frontend::User> {
async fn get() -> BackendResult<Option<frontend::User>> {
<Option<backend::User> as LoggedInUser>::get()
.await
.map(|user| user.map(Into::into))
}
fn is_logged_in(&self) -> bool {
self.is_some()
}
fn is_admin(&self) -> bool {
self.as_ref().is_some_and(|user| user.admin)
}
}
impl LoggedInUser for frontend::User {
async fn get() -> BackendResult<frontend::User> {
<Option<frontend::User> as LoggedInUser>::get()
.await?
.ok_or(AuthError::Unauthorized.into())
}
fn is_logged_in(&self) -> bool {
true
}
fn is_admin(&self) -> bool {
self.admin
}
}
}
}

src/api/health.rs (new file)

@@ -0,0 +1,20 @@
use crate::prelude::*;
#[api_fn(endpoint = "health")]
pub async fn health(state: BackendState, db_conn: &mut PgPooledConn) -> BackendResult<String> {
use diesel::connection::SimpleConnection;
use tower_sessions_redis_store::fred::interfaces::ClientLike;
db_conn
.batch_execute("SELECT 1")
.context("Failed to execute database health check query")?;
state
.get_redis_conn()
.ping::<()>(None)
.await
.map_err(|e| BackendError::InternalError(format!("{e}")))
.context("Failed to execute Redis health check ping")?;
Ok("ok".to_string())
}


@@ -1,44 +1,67 @@
use chrono::NaiveDateTime;
use leptos::*;
use crate::models::HistoryEntry;
use crate::models::Song;
use cfg_if::cfg_if;
cfg_if! {
if #[cfg(feature = "ssr")] {
use leptos::server_fn::error::NoCustomError;
use crate::database::get_db_conn;
use crate::auth::get_user;
}
}
use crate::prelude::*;
/// Get the history of the current user.
#[server(endpoint = "history/get")]
pub async fn get_history(limit: Option<i64>) -> Result<Vec<HistoryEntry>, ServerFnError> {
let user = get_user().await?;
let db_con = &mut get_db_conn();
let history = user.get_history(limit, db_con)
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error getting history: {}", e)))?;
Ok(history)
#[api_fn(endpoint = "history/get")]
pub async fn get_history(
limit: Option<i64>,
user: backend::User,
db_conn: &mut PgPooledConn,
) -> BackendResult<Vec<backend::HistoryEntry>> {
if let Some(limit) = limit {
song_history::table
.filter(song_history::user_id.eq(user.id))
.order(song_history::date.desc())
.limit(limit)
.load(db_conn)
} else {
song_history::table
.filter(song_history::user_id.eq(user.id))
.order(song_history::date.desc())
.load(db_conn)
}
.context("Error getting user history")
}
/// Get the listen dates and songs of the current user.
#[server(endpoint = "history/get_songs")]
pub async fn get_history_songs(limit: Option<i64>) -> Result<Vec<(NaiveDateTime, Song)>, ServerFnError> {
let user = get_user().await?;
let db_con = &mut get_db_conn();
let songs = user.get_history_songs(limit, db_con)
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error getting history songs: {}", e)))?;
Ok(songs)
#[api_fn(endpoint = "history/get_songs")]
pub async fn get_history_songs(
limit: Option<i64>,
user: backend::User,
db_conn: &mut PgPooledConn,
) -> BackendResult<Vec<(NaiveDateTime, backend::Song)>> {
if let Some(limit) = limit {
song_history::table
.inner_join(songs::table)
.filter(song_history::user_id.eq(user.id))
.order(song_history::date.desc())
.limit(limit)
.select((song_history::date, songs::table::all_columns()))
.load(db_conn)
} else {
song_history::table
.inner_join(songs::table)
.filter(song_history::user_id.eq(user.id))
.order(song_history::date.desc())
.select((song_history::date, songs::table::all_columns()))
.load(db_conn)
}
.context("Error getting user history songs")
}
/// Add a song to the history of the current user.
#[server(endpoint = "history/add")]
pub async fn add_history(song_id: i32) -> Result<(), ServerFnError> {
let user = get_user().await?;
let db_con = &mut get_db_conn();
user.add_history(song_id, db_con)
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error adding history: {}", e)))?;
Ok(())
#[api_fn(endpoint = "history/add")]
pub async fn add_history(
song_id: i32,
user: backend::User,
db_conn: &mut PgPooledConn,
) -> BackendResult<()> {
diesel::insert_into(song_history::table)
.values((
song_history::user_id.eq(user.id),
song_history::song_id.eq(song_id),
))
.execute(db_conn)
.context("Error adding song to history")?;
Ok(())
}


@@ -1,4 +1,12 @@
pub mod history;
pub mod profile;
pub mod songs;
pub mod album;
pub mod albums;
pub mod artists;
pub mod auth;
pub mod health;
pub mod history;
pub mod playlists;
pub mod profile;
pub mod search;
pub mod songs;
pub mod upload;
pub mod users;

src/api/playlists.rs (new file)

@@ -0,0 +1,347 @@
use crate::prelude::*;
use server_fn::codec::MultipartData;
cfg_if! {
if #[cfg(feature = "ssr")] {
use std::fs;
}
}
#[cfg(feature = "ssr")]
async fn user_owns_playlist(user_id: i32, playlist_id: i32) -> BackendResult<bool> {
let mut db_conn = BackendState::get().await?.get_db_conn()?;
let exists = playlists::table
.find(playlist_id)
.filter(playlists::owner_id.eq(user_id))
.select(playlists::id)
.first::<i32>(&mut db_conn)
.optional()
.context("Error loading playlist from database")?
.is_some();
Ok(exists)
}
#[api_fn(endpoint = "playlists/get_all")]
pub async fn get_playlists(
user: backend::User,
db_conn: &mut PgPooledConn,
) -> BackendResult<Vec<frontend::Playlist>> {
let playlists = playlists::table
.filter(playlists::owner_id.eq(user.id))
.select(playlists::all_columns)
.load::<backend::Playlist>(db_conn)
.context("Error loading playlists from database")?;
Ok(playlists
.into_iter()
.map(|playlist| playlist.into())
.collect())
}
#[cfg(feature = "ssr")]
pub async fn get_backend_playlist(playlist_id: i32) -> BackendResult<Option<backend::Playlist>> {
let user_id = api::auth::get_user()
.await
.context("Error getting logged-in user")?
.id;
let mut db_conn = BackendState::get().await?.get_db_conn()?;
playlists::table
.find(playlist_id)
.filter(playlists::owner_id.eq(user_id))
.select(playlists::all_columns)
.first(&mut db_conn)
.optional()
.context("Error loading playlist from database")
}
#[api_fn(endpoint = "playlists/get")]
pub async fn get_playlist(playlist_id: i32) -> BackendResult<Option<frontend::Playlist>> {
Ok(get_backend_playlist(playlist_id)
.await?
.map(|playlist| playlist.into()))
}
#[api_fn(endpoint = "playlists/get_songs")]
pub async fn get_playlist_songs(
playlist_id: i32,
user: backend::User,
db_conn: &mut PgPooledConn,
) -> BackendResult<Vec<i32>> {
// Check if the playlist exists and belongs to the user
let valid_playlist = user_owns_playlist(user.id, playlist_id)
.await
.context("Error checking if playlist exists and is owned by user")?;
if !valid_playlist {
return Err(AccessError::NotFoundOrUnauthorized
.context("Playlist does not exist or does not belong to the user"));
}
playlist_songs::table
.filter(crate::schema::playlist_songs::playlist_id.eq(playlist_id))
.select(playlist_songs::song_id)
.load(db_conn)
.context("Error loading playlist songs from database")
}
#[api_fn(endpoint = "playlists/add_song")]
pub async fn add_song_to_playlist(
playlist_id: i32,
song_id: i32,
user: backend::User,
db_conn: &mut PgPooledConn,
) -> BackendResult<()> {
// Check if the playlist exists and belongs to the user
let valid_playlist = user_owns_playlist(user.id, playlist_id)
.await
.context("Error checking if playlist exists and is owned by user")?;
if !valid_playlist {
return Err(AccessError::NotFoundOrUnauthorized
.context("Playlist does not exist or does not belong to the user"));
}
diesel::insert_into(crate::schema::playlist_songs::table)
.values((
playlist_songs::playlist_id.eq(playlist_id),
playlist_songs::song_id.eq(song_id),
))
.execute(db_conn)
.context("Error adding song to playlist in database")?;
Ok(())
}
#[api_fn(upload, endpoint = "playlists/create")]
pub async fn create_playlist(
data: MultipartData,
user: backend::User,
db_conn: &mut PgPooledConn,
) -> BackendResult<()> {
use image_convert::{ImageResource, WEBPConfig, to_webp};
// Safe to unwrap - "On the server side, this always returns Some(_). On the client side, always returns None."
let mut data = data.into_inner().unwrap();
let mut playlist_name = None;
let mut picture_data = None;
while let Ok(Some(field)) = data.next_field().await {
let name = field.name().unwrap_or_default().to_string();
match name.as_str() {
"name" => {
playlist_name = Some(extract_field(field).await?);
}
"picture" => {
// Read the image
let bytes = field
.bytes()
.await
.map_err(|e| InputError::FieldReadError(format!("{e}")))
.context("Error reading bytes of the picture field")?;
// Check if the image is empty
if !bytes.is_empty() {
let reader = std::io::Cursor::new(bytes);
let image_source = ImageResource::from_reader(reader)
.context("Error creating image resource from reader")?;
picture_data = Some(image_source);
}
}
_ => {
warn!("Unknown playlist creation field: {name}");
}
}
}
// Unwrap mandatory fields
let name = playlist_name.ok_or_else(|| {
InputError::MissingField("name".to_string()).context("Missing playlist name")
})?;
let new_playlist = backend::NewPlaylist {
name: name.clone(),
owner_id: user.id,
image_path: None,
};
// Create a transaction to create the playlist
// If saving the image fails, the playlist will not be created
db_conn.transaction(|db_conn| {
let playlist = diesel::insert_into(playlists::table)
.values(&new_playlist)
.get_result::<backend::Playlist>(db_conn)
.context("Error creating playlist in database")?;
// If a picture was provided, save it to the database
if let Some(image_source) = picture_data {
let image_path = format!("assets/images/playlist/{}.webp", playlist.id);
let mut image_target = ImageResource::from_path(&image_path);
to_webp(&mut image_target, &image_source, &WEBPConfig::new())
.map_err(|e| InputError::InvalidInput(format!("{e}")))
.context("Error converting image to webp")?;
}
Ok::<(), BackendError>(())
})
}
#[api_fn(upload, endpoint = "playlists/edit_image")]
pub async fn edit_playlist_image(
data: MultipartData,
state: BackendState,
db_conn: &mut PgPooledConn,
) -> BackendResult<()> {
use image_convert::{ImageResource, WEBPConfig, to_webp};
// Safe to unwrap - "On the server side, this always returns Some(_). On the client side, always returns None."
let mut data = data.into_inner().unwrap();
let mut playlist_id = None;
let mut picture_data = None;
while let Ok(Some(field)) = data.next_field().await {
let name = field.name().unwrap_or_default().to_string();
match name.as_str() {
"id" => {
playlist_id = Some(extract_field(field).await?);
}
"picture" => {
// Read the image
let bytes = field
.bytes()
.await
.map_err(|e| InputError::FieldReadError(format!("{e}")))
.context("Error reading bytes of the picture field")?;
// Check if the image is empty
if !bytes.is_empty() {
let reader = std::io::Cursor::new(bytes);
let image_source = ImageResource::from_reader(reader)
.context("Error creating image resource from reader")?;
picture_data = Some(image_source);
}
}
_ => {
warn!("Unknown playlist image edit field: {name}");
}
}
}
// Unwrap mandatory fields
let playlist_id = playlist_id
.ok_or_else(|| InputError::MissingField("id".to_string()).context("Missing playlist ID"))?;
let playlist_id: i32 = playlist_id
.parse()
.map_err(|e| InputError::InvalidInput(format!("Invalid playlist ID: {e}")))
.context("Error parsing playlist ID from string")?;
let Some(image_source) = picture_data else {
return Err(
InputError::MissingField("picture".to_string()).context("Missing playlist image")
);
};
let playlist = get_backend_playlist(playlist_id)
.await
.context("Failed getting playlist for image edit")?;
let Some(playlist) = playlist else {
return Err(AccessError::NotFoundOrUnauthorized
.context("Playlist does not exist or does not belong to the user"));
};
// If a picture was provided, save it to the database
let base_path = state.get_asset_path(&AssetType::Image);
let relative_path = AssetType::Image.new_path("webp");
let full_path = base_path.join(relative_path.clone().path());
let parent_path = full_path
.parent()
.ok_or(BackendError::InternalError(format!(
"Unable to get parent of path {}",
full_path.display()
)))?;
fs::create_dir_all(parent_path)
.context("Failed to create parent directories for new playlist image")?;
let mut image_target = ImageResource::from_path(&full_path);
to_webp(&mut image_target, &image_source, &WEBPConfig::new())
.map_err(|e| InputError::InvalidInput(format!("{e}")))
.inspect_err(|e| error!("Error: {e}"))
.context("Error converting image to webp")?;
diesel::update(playlists::table.find(playlist_id))
.set(playlists::image_path.eq(relative_path))
.execute(db_conn)
.context("Error changing playlist image path in database")?;
if let Some(previous_image) = playlist.image_path {
let full_previous_image_path = base_path.join(previous_image.path());
fs::remove_file(full_previous_image_path).context("Failed to delete old playlist image")?;
}
Ok(())
}
#[api_fn(endpoint = "playlists/delete")]
pub async fn delete_playlist(
playlist_id: i32,
user: backend::User,
db_conn: &mut PgPooledConn,
) -> BackendResult<()> {
// Check if the playlist exists and belongs to the user
let valid_playlist = user_owns_playlist(user.id, playlist_id)
.await
.context("Error checking if playlist exists and is owned by user")?;
if !valid_playlist {
return Err(AccessError::NotFoundOrUnauthorized
.context("Playlist does not exist or does not belong to the user"));
}
diesel::delete(playlists::table.find(playlist_id))
.execute(db_conn)
.context("Error deleting playlist from database")?;
Ok(())
}
#[api_fn(endpoint = "playlists/rename")]
pub async fn rename_playlist(
id: i32,
new_name: String,
user: backend::User,
db_conn: &mut PgPooledConn,
) -> BackendResult<()> {
// Check if the playlist exists and belongs to the user
let valid_playlist = user_owns_playlist(user.id, id)
.await
.context("Error checking if playlist exists and is owned by user")?;
if !valid_playlist {
return Err(AccessError::NotFoundOrUnauthorized.into());
}
diesel::update(playlists::table.find(id))
.set(playlists::name.eq(new_name))
.execute(db_conn)
.context("Error renaming playlist in database")?;
Ok(())
}


@@ -1,64 +1,87 @@
use leptos::*;
use server_fn::codec::{MultipartData, MultipartFormData};
use crate::prelude::*;
use cfg_if::cfg_if;
use crate::songdata::SongData;
use crate::artistdata::ArtistData;
use chrono::NaiveDateTime;
use server_fn::codec::MultipartData;
cfg_if! {
if #[cfg(feature = "ssr")] {
use crate::auth::get_user;
use server_fn::error::NoCustomError;
use crate::database::get_db_conn;
use diesel::prelude::*;
use diesel::dsl::count;
use crate::models::*;
use crate::schema::*;
use std::collections::HashMap;
}
if #[cfg(feature = "ssr")] {
use std::fs;
}
}
/// Handle a user uploading a profile picture. Converts the image to webp and saves it to the server.
#[server(input = MultipartFormData, endpoint = "/profile/upload_picture")]
pub async fn upload_picture(data: MultipartData) -> Result<(), ServerFnError> {
// Safe to unwrap - "On the server side, this always returns Some(_). On the client side, always returns None."
let mut data = data.into_inner().unwrap();
#[api_fn(upload, endpoint = "/profile/upload_picture")]
pub async fn upload_picture(
data: MultipartData,
user: backend::User,
db_conn: &mut PgPooledConn,
state: BackendState,
) -> BackendResult<()> {
// Safe to unwrap - "On the server side, this always returns Some(_). On the client side, always returns None."
let mut data = data.into_inner().unwrap();
let field = data.next_field().await
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error getting field: {}", e)))?
.ok_or_else(|| ServerFnError::<NoCustomError>::ServerError("No field found".to_string()))?;
let field = data
.next_field()
.await
.map_err(|e| InputError::InvalidInput(format!("Error reading multipart data: {e}")))
.context("Error getting next field from multipart data")?
.ok_or_else(|| {
InputError::InvalidInput("Expected a field in the multipart data".to_string())
})?;
if field.name() != Some("picture") {
return Err(ServerFnError::ServerError("Field name is not 'picture'".to_string()));
}
if field.name() != Some("picture") {
return Err(InputError::InvalidInput(format!(
"Expected field 'picture', got '{:?}'",
field.name()
))
.into());
}
// Get user id from session
let user = get_user().await
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error getting user: {}", e)))?;
// Read the image, and convert it to webp
use image_convert::{ImageResource, WEBPConfig, to_webp};
let user_id = user.id.ok_or_else(|| ServerFnError::<NoCustomError>::ServerError("User has no id".to_string()))?;
let bytes = field
.bytes()
.await
.map_err(|e| InputError::InvalidInput(format!("Error reading bytes from field: {e}")))
.context("Error reading bytes of the picture field")?;
// Read the image, and convert it to webp
use image_convert::{to_webp, WEBPConfig, ImageResource};
let reader = std::io::Cursor::new(bytes);
let image_source = ImageResource::from_reader(reader)
.map_err(|e| InputError::InvalidInput(format!("Error creating image resource: {e}")))
.context("Error creating image resource from reader")?;
let bytes = field.bytes().await
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error getting field bytes: {}", e)))?;
let base_path = state.get_asset_path(&AssetType::Image);
let reader = std::io::Cursor::new(bytes);
let image_source = ImageResource::from_reader(reader)
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error creating image resource: {}", e)))?;
let relative_path = AssetType::Image.new_path("webp");
let profile_picture_path = format!("assets/images/profile/{}.webp", user_id);
let mut image_target = ImageResource::from_path(&profile_picture_path);
to_webp(&mut image_target, &image_source, &WEBPConfig::new())
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error converting image to webp: {}", e)))?;
let full_path = base_path.join(relative_path.clone().path());
Ok(())
let parent_path = full_path
.parent()
.ok_or(BackendError::InternalError(format!(
"Unable to get parent of path {}",
full_path.display()
)))?;
fs::create_dir_all(parent_path)
.context("Failed to create parent directories for new user image")?;
let mut image_target = ImageResource::from_path(&full_path);
to_webp(&mut image_target, &image_source, &WEBPConfig::new())
.map_err(|e| InputError::InvalidInput(format!("Error converting image to webp: {e}")))?;
diesel::update(users::table.find(user.id))
.set(users::image_path.eq(relative_path))
.execute(db_conn)
.context("Error changing user image path in database")?;
if let Some(previous_image) = user.image_path {
let full_previous_image_path = base_path.join(previous_image.path());
fs::remove_file(full_previous_image_path).context("Failed to delete old user image")?;
}
Ok(())
}
/// Get a user's recent songs listened to
@@ -66,235 +89,130 @@ pub async fn upload_picture(data: MultipartData) -> Result<(), ServerFnError> {
/// If not provided, all songs ever listened to are returned.
/// Returns a list of tuples with the date the song was listened to
/// and the song data, sorted by date (most recent first).
#[server(endpoint = "/profile/recent_songs")]
pub async fn recent_songs(for_user_id: i32, limit: Option<i64>) -> Result<Vec<(NaiveDateTime, SongData)>, ServerFnError> {
let mut db_con = get_db_conn();
// Get the ids of the most recent songs listened to
let history_items: Vec<i32> =
if let Some(limit) = limit {
song_history::table
.filter(song_history::user_id.eq(for_user_id))
.order(song_history::date.desc())
.limit(limit)
.select(song_history::id)
.load(&mut db_con)?
} else {
song_history::table
.filter(song_history::user_id.eq(for_user_id))
.order(song_history::date.desc())
.select(song_history::id)
.load(&mut db_con)?
};
// Take the history ids and get the song data for them
let history: Vec<(HistoryEntry, Song, Option<Album>, Option<Artist>, Option<(i32, i32)>, Option<(i32, i32)>)>
= song_history::table
.filter(song_history::id.eq_any(history_items))
.inner_join(songs::table)
.left_join(albums::table.on(songs::album_id.eq(albums::id.nullable())))
.left_join(song_artists::table.inner_join(artists::table).on(songs::id.eq(song_artists::song_id)))
.left_join(song_likes::table.on(songs::id.eq(song_likes::song_id).and(song_likes::user_id.eq(for_user_id))))
.left_join(song_dislikes::table.on(
songs::id.eq(song_dislikes::song_id).and(song_dislikes::user_id.eq(for_user_id))))
.select((
song_history::all_columns,
songs::all_columns,
albums::all_columns.nullable(),
artists::all_columns.nullable(),
song_likes::all_columns.nullable(),
song_dislikes::all_columns.nullable(),
))
.load(&mut db_con)?;
// Process the history data into a map of song ids to song data
let mut history_songs: HashMap<i32, (NaiveDateTime, SongData)> = HashMap::with_capacity(history.len());
for (history, song, album, artist, like, dislike) in history {
let song_id = history.song_id;
if let Some((_, stored_songdata)) = history_songs.get_mut(&song_id) {
// If the song is already in the map, update the artists
if let Some(artist) = artist {
stored_songdata.artists.push(artist);
}
} else {
let like_dislike = match (like, dislike) {
(Some(_), Some(_)) => Some((true, true)),
(Some(_), None) => Some((true, false)),
(None, Some(_)) => Some((false, true)),
_ => None,
};
let image_path = song.image_path.unwrap_or(
album.as_ref().map(|album| album.image_path.clone()).flatten()
.unwrap_or("/assets/images/placeholders/MusicPlaceholder.svg".to_string()));
let songdata = SongData {
id: song_id,
title: song.title,
artists: artist.map(|artist| vec![artist]).unwrap_or_default(),
album: album,
track: song.track,
duration: song.duration,
release_date: song.release_date,
song_path: song.storage_path,
image_path: image_path,
like_dislike: like_dislike,
};
history_songs.insert(song_id, (history.date, songdata));
}
}
// Sort the songs by date
let mut history_songs: Vec<(NaiveDateTime, SongData)> = history_songs.into_values().collect();
history_songs.sort_by(|a, b| b.0.cmp(&a.0));
Ok(history_songs)
#[api_fn(endpoint = "/profile/recent_songs")]
pub async fn recent_songs(
for_user_id: i32,
limit: Option<i64>,
db_conn: &mut PgPooledConn,
) -> BackendResult<Vec<(i32, NaiveDateTime)>> {
if let Some(limit) = limit {
song_history::table
.filter(song_history::user_id.eq(for_user_id))
.order(song_history::date.desc())
.select((song_history::song_id, song_history::date))
.limit(limit)
.load(db_conn)
} else {
song_history::table
.filter(song_history::user_id.eq(for_user_id))
.order(song_history::date.desc())
.select((song_history::song_id, song_history::date))
.load(db_conn)
}
.context("Failed to get user recent songs")
}
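Over in-memory rows, the two query branches above compute the following (a simplified sketch with hypothetical types; the real work happens in SQL):

```rust
// Sketch: sort (song_id, date) rows by date descending, then apply
// the optional limit — mirroring the limit/no-limit query branches.
fn recent(mut rows: Vec<(i32, u64)>, limit: Option<usize>) -> Vec<(i32, u64)> {
    rows.sort_by(|a, b| b.1.cmp(&a.1));
    match limit {
        Some(n) => rows.into_iter().take(n).collect(),
        None => rows,
    }
}

fn main() {
    let rows = vec![(1, 5), (2, 9), (3, 1)];
    println!("{:?}", recent(rows, Some(2)));
}
```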
/// Get a user's top songs by play count from a date range
/// Optionally takes a limit parameter to limit the number of songs returned.
/// If not provided, all songs listened to in the date range are returned.
/// Returns a list of tuples with the play count and the song data, sorted by play count (most played first).
#[server(endpoint = "/profile/top_songs")]
pub async fn top_songs(for_user_id: i32, start_date: NaiveDateTime, end_date: NaiveDateTime, limit: Option<i64>)
-> Result<Vec<(i64, SongData)>, ServerFnError>
{
let mut db_con = get_db_conn();
// Get the play count and ids of the songs listened to in the date range
let history_counts: Vec<(i32, i64)> =
if let Some(limit) = limit {
song_history::table
.filter(song_history::date.between(start_date, end_date))
.filter(song_history::user_id.eq(for_user_id))
.group_by(song_history::song_id)
.select((song_history::song_id, count(song_history::song_id)))
.order(count(song_history::song_id).desc())
.limit(limit)
.load(&mut db_con)?
} else {
song_history::table
.filter(song_history::date.between(start_date, end_date))
.filter(song_history::user_id.eq(for_user_id))
.group_by(song_history::song_id)
.select((song_history::song_id, count(song_history::song_id)))
.load(&mut db_con)?
};
let history_counts: HashMap<i32, i64> = history_counts.into_iter().collect();
let history_song_ids = history_counts.iter().map(|(song_id, _)| *song_id).collect::<Vec<i32>>();
// Get the song data for the songs listened to in the date range
let history_songs: Vec<(Song, Option<Album>, Option<Artist>, Option<(i32, i32)>, Option<(i32, i32)>)>
= songs::table
.filter(songs::id.eq_any(history_song_ids))
.left_join(albums::table.on(songs::album_id.eq(albums::id.nullable())))
.left_join(song_artists::table.inner_join(artists::table).on(songs::id.eq(song_artists::song_id)))
.left_join(song_likes::table.on(songs::id.eq(song_likes::song_id).and(song_likes::user_id.eq(for_user_id))))
.left_join(song_dislikes::table.on(
songs::id.eq(song_dislikes::song_id).and(song_dislikes::user_id.eq(for_user_id))))
.select((
songs::all_columns,
albums::all_columns.nullable(),
artists::all_columns.nullable(),
song_likes::all_columns.nullable(),
song_dislikes::all_columns.nullable(),
))
.load(&mut db_con)?;
// Process the history data into a map of song ids to song data
let mut history_songs_map: HashMap<i32, (i64, SongData)> = HashMap::with_capacity(history_counts.len());
for (song, album, artist, like, dislike) in history_songs {
let song_id = song.id
.ok_or(ServerFnError::ServerError::<NoCustomError>("Song id not found in database".to_string()))?;
if let Some((_, stored_songdata)) = history_songs_map.get_mut(&song_id) {
// If the song is already in the map, update the artists
if let Some(artist) = artist {
stored_songdata.artists.push(artist);
}
} else {
let like_dislike = match (like, dislike) {
(Some(_), Some(_)) => Some((true, true)),
(Some(_), None) => Some((true, false)),
(None, Some(_)) => Some((false, true)),
_ => None,
};
let image_path = song.image_path.unwrap_or(
album.as_ref().map(|album| album.image_path.clone()).flatten()
.unwrap_or("/assets/images/placeholders/MusicPlaceholder.svg".to_string()));
let songdata = SongData {
id: song_id,
title: song.title,
artists: artist.map(|artist| vec![artist]).unwrap_or_default(),
album: album,
track: song.track,
duration: song.duration,
release_date: song.release_date,
song_path: song.storage_path,
image_path: image_path,
like_dislike: like_dislike,
};
let plays = history_counts.get(&song_id)
.ok_or(ServerFnError::ServerError::<NoCustomError>("Song id not found in history counts".to_string()))?;
history_songs_map.insert(song_id, (*plays, songdata));
}
}
// Sort the songs by play count
let mut history_songs: Vec<(i64, SongData)> = history_songs_map.into_values().collect();
history_songs.sort_by(|a, b| b.0.cmp(&a.0));
Ok(history_songs)
#[api_fn(endpoint = "/profile/top_songs")]
pub async fn top_songs(
for_user_id: i32,
start_date: NaiveDateTime,
end_date: NaiveDateTime,
limit: Option<i64>,
db_conn: &mut PgPooledConn,
) -> BackendResult<Vec<(i32, i64)>> {
// Get the play count and ids of the songs listened to in the date range
if let Some(limit) = limit {
song_history::table
.filter(song_history::date.between(start_date, end_date))
.filter(song_history::user_id.eq(for_user_id))
.group_by(song_history::song_id)
.select((
song_history::song_id,
diesel::dsl::count(song_history::song_id),
))
.order(diesel::dsl::count(song_history::song_id).desc())
.limit(limit)
.load(db_conn)
} else {
song_history::table
.filter(song_history::date.between(start_date, end_date))
.filter(song_history::user_id.eq(for_user_id))
.group_by(song_history::song_id)
.select((
song_history::song_id,
diesel::dsl::count(song_history::song_id),
))
.order(diesel::dsl::count(song_history::song_id).desc())
.load(db_conn)
}
.context("Failed to get user top songs")
}
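The grouped count query can be mirrored over in-memory data as a sketch (hypothetical simplified types; the real aggregation happens in SQL):

```rust
use std::collections::HashMap;

// Sketch: count plays per song id within [start, end], sort by count
// descending, and apply the optional limit — what the GROUP BY /
// count / ORDER BY query above computes.
fn top_songs(plays: &[(i32, u64)], start: u64, end: u64, limit: Option<usize>) -> Vec<(i32, i64)> {
    let mut counts: HashMap<i32, i64> = HashMap::new();
    for &(song_id, date) in plays {
        if (start..=end).contains(&date) {
            *counts.entry(song_id).or_insert(0) += 1;
        }
    }
    let mut out: Vec<(i32, i64)> = counts.into_iter().collect();
    out.sort_by(|a, b| b.1.cmp(&a.1));
    if let Some(n) = limit {
        out.truncate(n);
    }
    out
}

fn main() {
    let plays = [(1, 10), (1, 20), (2, 15), (3, 99)];
    println!("{:?}", top_songs(&plays, 0, 50, None));
}
```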
/// Get a user's top artists by play count from a date range
/// Optionally takes a limit parameter to limit the number of artists returned.
/// If not provided, all artists listened to in the date range are returned.
/// Returns a list of tuples with the play count and the artist data, sorted by play count (most played first).
#[server(endpoint = "/profile/top_artists")]
pub async fn top_artists(for_user_id: i32, start_date: NaiveDateTime, end_date: NaiveDateTime, limit: Option<i64>)
-> Result<Vec<(i64, ArtistData)>, ServerFnError>
{
let mut db_con = get_db_conn();
#[api_fn(endpoint = "/profile/top_artists")]
pub async fn top_artists(
for_user_id: i32,
start_date: NaiveDateTime,
end_date: NaiveDateTime,
limit: Option<i64>,
db_conn: &mut PgPooledConn,
) -> BackendResult<Vec<(i64, frontend::Artist)>> {
let artist_counts: Vec<(i64, backend::Artist)> = if let Some(limit) = limit {
song_history::table
.filter(song_history::date.between(start_date, end_date))
.filter(song_history::user_id.eq(for_user_id))
.inner_join(song_artists::table.on(song_history::song_id.eq(song_artists::song_id)))
.inner_join(artists::table.on(song_artists::artist_id.eq(artists::id)))
.group_by(artists::id)
.select((diesel::dsl::count(artists::id), artists::all_columns))
.order(diesel::dsl::count(artists::id).desc())
.limit(limit)
.load(db_conn)
.context("Error loading top artists from database")?
} else {
song_history::table
.filter(song_history::date.between(start_date, end_date))
.filter(song_history::user_id.eq(for_user_id))
.inner_join(song_artists::table.on(song_history::song_id.eq(song_artists::song_id)))
.inner_join(artists::table.on(song_artists::artist_id.eq(artists::id)))
.group_by(artists::id)
.select((diesel::dsl::count(artists::id), artists::all_columns))
.order(diesel::dsl::count(artists::id).desc())
.load(db_conn)
.context("Error loading top artists from database")?
};
let artist_counts: Vec<(i64, Artist)> =
if let Some(limit) = limit {
song_history::table
.filter(song_history::date.between(start_date, end_date))
.filter(song_history::user_id.eq(for_user_id))
.inner_join(song_artists::table.on(song_history::song_id.eq(song_artists::song_id)))
.inner_join(artists::table.on(song_artists::artist_id.eq(artists::id)))
.group_by(artists::id)
.select((count(artists::id), artists::all_columns))
.order(count(artists::id).desc())
.limit(limit)
.load(&mut db_con)?
} else {
song_history::table
.filter(song_history::date.between(start_date, end_date))
.filter(song_history::user_id.eq(for_user_id))
.inner_join(song_artists::table.on(song_history::song_id.eq(song_artists::song_id)))
.inner_join(artists::table.on(song_artists::artist_id.eq(artists::id)))
.group_by(artists::id)
.select((count(artists::id), artists::all_columns))
.order(count(artists::id).desc())
.load(&mut db_con)?
};
let artist_data: Vec<(i64, frontend::Artist)> = artist_counts
.into_iter()
.map(|(plays, artist)| {
(
plays,
frontend::Artist {
id: artist.id,
name: artist.name,
image_path: LocalPath::to_web_path_or_placeholder(artist.image_path),
},
)
})
.collect();
let artist_data: Vec<(i64, ArtistData)> = artist_counts.into_iter().map(|(plays, artist)| {
(plays, ArtistData {
id: artist.id.unwrap(),
name: artist.name,
image_path: format!("/assets/images/artists/{}.webp", artist.id.unwrap()),
})
}).collect();
Ok(artist_data)
}
#[api_fn(endpoint = "/profile/liked_songs")]
pub async fn get_liked_songs(
user: backend::User,
db_conn: &mut PgPooledConn,
) -> BackendResult<Vec<i32>> {
crate::schema::song_likes::table
.filter(crate::schema::song_likes::user_id.eq(user.id))
.select(song_likes::song_id)
.load(db_conn)
.context("Error loading liked songs from database")
}

src/api/search.rs (new file, 191 lines)

@@ -0,0 +1,191 @@
use crate::prelude::*;
cfg_if! {
if #[cfg(feature = "ssr")] {
use diesel::sql_types::*;
use diesel::pg::Pg;
use diesel::expression::AsExpression;
// Define pg_trgm operators
// Postgres uses trigram indexes only with these operators, not with the equivalent function calls, so operators are required
diesel::infix_operator!(Similarity, " % ", backend: Pg);
diesel::infix_operator!(Distance, " <-> ", Float, backend: Pg);
// Create functions to make use of the operators in queries
fn trgm_similar<T: AsExpression<Text>, U: AsExpression<Text>>(left: T, right: U)
-> Similarity<T::Expression, U::Expression> {
Similarity::new(left.as_expression(), right.as_expression())
}
fn trgm_distance<T: AsExpression<Text>, U: AsExpression<Text>>(left: T, right: U)
-> Distance<T::Expression, U::Expression> {
Distance::new(left.as_expression(), right.as_expression())
}
}
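As a rough illustration of what the `%` and `<->` operators compute: pg_trgm compares the sets of three-character substrings of the padded, lowercased strings. The sketch below approximates that similarity measure; Postgres's exact padding and normalization rules may differ slightly:

```rust
use std::collections::HashSet;

// Approximation of pg_trgm: trigrams of the lowercased string padded
// with two leading spaces and one trailing space; similarity is the
// ratio of shared trigrams to the union of both sets.
fn trigrams(s: &str) -> HashSet<String> {
    let padded = format!("  {} ", s.to_lowercase());
    let chars: Vec<char> = padded.chars().collect();
    chars.windows(3).map(|w| w.iter().collect()).collect()
}

fn similarity(a: &str, b: &str) -> f32 {
    let (ta, tb) = (trigrams(a), trigrams(b));
    let shared = ta.intersection(&tb).count() as f32;
    let total = ta.union(&tb).count() as f32;
    if total == 0.0 { 0.0 } else { shared / total }
}

fn main() {
    println!("{:.2}", similarity("hello", "hallo"));
}
```

The `<->` distance operator is `1 - similarity`, which is why the queries below can order by it to rank fuzzy matches.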
}
/// A simple type for search results
/// A vector of tuples containing the item and its score
pub type SearchResults<T> = Vec<(T, f32)>;
/// Turn `SearchResults` into just the results, discarding scores
pub fn remove_search_score<T>(results: SearchResults<T>) -> Vec<T> {
results.into_iter().map(|(item, _score)| item).collect()
}
/// Search for albums by title
///
/// # Arguments
/// `query` - The search query. This will be used to perform a fuzzy search on the album titles
/// `limit` - The maximum number of results to return
///
/// # Returns
/// A Result containing a vector of albums if the search was successful, or an error if the search failed
#[api_fn(endpoint = "search_albums")]
pub async fn search_albums(
query: String,
limit: i64,
db_conn: &mut PgPooledConn,
) -> BackendResult<SearchResults<frontend::Album>> {
let album_ids = albums::table
.filter(trgm_similar(albums::title, query.clone()))
.order_by(trgm_distance(albums::title, query.clone()).desc())
.limit(limit)
.select(albums::id)
.into_boxed();
let mut albums_map: HashMap<i32, (frontend::Album, f32)> = HashMap::new();
let album_artists: Vec<(backend::Album, Option<backend::Artist>, f32)> = albums::table
.filter(albums::id.eq_any(album_ids))
.left_join(
album_artists::table
.inner_join(artists::table)
.on(albums::id.eq(album_artists::album_id)),
)
.select((
albums::all_columns,
artists::all_columns.nullable(),
trgm_distance(albums::title, query.clone()),
))
.load(db_conn)
.context("Error loading album artists from database")?;
for (album, artist, score) in album_artists {
if let Some((stored_album, _score)) = albums_map.get_mut(&album.id) {
if let Some(artist) = artist {
stored_album.artists.push(artist);
}
} else {
let albumdata = frontend::Album {
id: album.id,
title: album.title,
artists: artist.map(|artist| vec![artist]).unwrap_or(vec![]),
release_date: album.release_date,
image_path: LocalPath::to_web_path_or_placeholder(album.image_path),
};
albums_map.insert(album.id, (albumdata, score));
}
}
let mut albums: Vec<(frontend::Album, f32)> = albums_map.into_values().collect();
albums.sort_by(|(_a, a_score), (_b, b_score)| b_score.total_cmp(a_score));
Ok(albums)
}
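The final sort above uses `total_cmp` because `f32` scores are not `Ord` (NaN breaks the usual ordering); `total_cmp` defines a total order over all floats, and swapping the receiver and argument yields descending order:

```rust
// Sketch of the score sort: highest f32 score first, safely even if
// a NaN ever slipped into the scores.
fn sort_by_score_desc(mut v: Vec<(&'static str, f32)>) -> Vec<(&'static str, f32)> {
    v.sort_by(|(_, a), (_, b)| b.total_cmp(a));
    v
}

fn main() {
    println!("{:?}", sort_by_score_desc(vec![("a", 0.2), ("b", 0.9), ("c", 0.5)]));
}
```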
/// Search for artists by name
///
/// # Arguments
/// `query` - The search query. This will be used to perform a fuzzy search on the artist names
/// `limit` - The maximum number of results to return
///
/// # Returns
/// A Result containing a vector of artists if the search was successful, or an error if the search failed
#[api_fn(endpoint = "search_artists")]
pub async fn search_artists(
query: String,
limit: i64,
db_conn: &mut PgPooledConn,
) -> BackendResult<SearchResults<frontend::Artist>> {
let artist_list = artists::table
.filter(trgm_similar(artists::name, query.clone()))
.order_by(trgm_distance(artists::name, query.clone()).desc())
.limit(limit)
.select((artists::all_columns, trgm_distance(artists::name, query)))
.load::<(backend::Artist, f32)>(db_conn)
.context("Error loading artists from database")?;
let artist_data = artist_list
.into_iter()
.map(|(artist, score)| {
(
frontend::Artist {
id: artist.id,
name: artist.name,
image_path: LocalPath::to_web_path_or_placeholder(artist.image_path),
},
score,
)
})
.collect();
Ok(artist_data)
}
/// Search for songs by title
///
/// # Arguments
/// `query` - The search query. This will be used to perform a fuzzy search on the song titles
/// `limit` - The maximum number of results to return
///
/// # Returns
/// A Result containing a vector of songs if the search was successful, or an error if the search failed
#[api_fn(endpoint = "search_songs")]
pub async fn search_songs(
query: String,
limit: i64,
db_conn: &mut PgPooledConn,
) -> BackendResult<SearchResults<i32>> {
songs::table
.filter(trgm_similar(songs::title, query.clone()))
.order_by(trgm_distance(songs::title, query.clone()).desc())
.limit(limit)
.select((songs::id, trgm_distance(songs::title, query.clone())))
.load(db_conn)
.context("Failed to search songs")
}
/// Search for songs, albums, and artists by title or name
///
/// # Arguments
/// `query` - The search query. This will be used to perform a fuzzy search on the
/// song titles, album titles, and artist names
/// `limit` - The maximum number of results to return for each type
///
/// # Returns
/// A Result containing a tuple of album, artist, and song search results if the search was successful, or an error if the search failed
#[api_fn(endpoint = "search")]
pub async fn search(
query: String,
limit: i64,
) -> BackendResult<(
SearchResults<frontend::Album>,
SearchResults<frontend::Artist>,
SearchResults<i32>,
)> {
let albums = search_albums(query.clone(), limit);
let artists = search_artists(query.clone(), limit);
let songs = search_songs(query, limit);
use tokio::join;
let (albums, artists, songs) = join!(albums, artists, songs);
let albums = albums.context("Error searching for albums")?;
let artists = artists.context("Error searching for artists")?;
let songs = songs.context("Error searching for songs")?;
Ok((albums, artists, songs))
}


@@ -1,55 +1,312 @@
use leptos::*;
use cfg_if::cfg_if;
cfg_if! {
if #[cfg(feature = "ssr")] {
use leptos::server_fn::error::NoCustomError;
use crate::database::get_db_conn;
use crate::auth::get_user;
}
}
use crate::prelude::*;
/// Like or unlike a song
#[server(endpoint = "songs/set_like")]
pub async fn set_like_song(song_id: i32, like: bool) -> Result<(), ServerFnError> {
let user = get_user().await.map_err(|e| ServerFnError::<NoCustomError>::
ServerError(format!("Error getting user: {}", e)))?;
let db_con = &mut get_db_conn();
#[api_fn(endpoint = "songs/set_like")]
pub async fn set_like_song(
song_id: i32,
like: bool,
user: backend::User,
db_conn: &mut PgPooledConn,
) -> BackendResult<()> {
if like {
diesel::insert_into(song_likes::table)
.values((
song_likes::song_id.eq(song_id),
song_likes::user_id.eq(user.id),
))
.execute(db_conn)
.context("Error liking song")?;
user.set_like_song(song_id, like, db_con).await.map_err(|e| ServerFnError::<NoCustomError>::
ServerError(format!("Error liking song: {}", e)))
// Remove dislike if it exists
diesel::delete(
song_dislikes::table.filter(
song_dislikes::song_id
.eq(song_id)
.and(song_dislikes::user_id.eq(user.id)),
),
)
.execute(db_conn)
.context("Error removing dislike for song")?;
} else {
diesel::delete(
song_likes::table.filter(
song_likes::song_id
.eq(song_id)
.and(song_likes::user_id.eq(user.id)),
),
)
.execute(db_conn)
.context("Error removing like for song")?;
}
Ok(())
}
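The two branches above maintain a mutual-exclusion invariant: a song cannot be both liked and disliked by the same user. A sketch of that invariant, with sets standing in for the `song_likes`/`song_dislikes` tables for one user:

```rust
use std::collections::HashSet;

// Sets stand in for the two join tables. Setting a like removes any
// existing dislike (and vice versa); clearing a reaction leaves the
// opposite state alone.
#[derive(Default)]
struct Reactions {
    likes: HashSet<i32>,
    dislikes: HashSet<i32>,
}

impl Reactions {
    fn set_like(&mut self, song: i32, like: bool) {
        if like {
            self.likes.insert(song);
            self.dislikes.remove(&song);
        } else {
            self.likes.remove(&song);
        }
    }

    fn set_dislike(&mut self, song: i32, dislike: bool) {
        if dislike {
            self.dislikes.insert(song);
            self.likes.remove(&song);
        } else {
            self.dislikes.remove(&song);
        }
    }
}

fn main() {
    let mut r = Reactions::default();
    r.set_dislike(1, true);
    r.set_like(1, true);
    println!("liked: {}, disliked: {}", r.likes.contains(&1), r.dislikes.contains(&1));
}
```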
/// Dislike or remove dislike from a song
#[server(endpoint = "songs/set_dislike")]
pub async fn set_dislike_song(song_id: i32, dislike: bool) -> Result<(), ServerFnError> {
let user = get_user().await.map_err(|e| ServerFnError::<NoCustomError>::
ServerError(format!("Error getting user: {}", e)))?;
let db_con = &mut get_db_conn();
#[api_fn(endpoint = "songs/set_dislike")]
pub async fn set_dislike_song(
song_id: i32,
dislike: bool,
user: backend::User,
db_conn: &mut PgPooledConn,
) -> BackendResult<()> {
if dislike {
diesel::insert_into(song_dislikes::table)
.values((
song_dislikes::song_id.eq(song_id),
song_dislikes::user_id.eq(user.id),
))
.execute(db_conn)
.context("Error disliking song")?;
user.set_dislike_song(song_id, dislike, db_con).await.map_err(|e| ServerFnError::<NoCustomError>::
ServerError(format!("Error disliking song: {}", e)))
// Remove like if it exists
diesel::delete(
song_likes::table.filter(
song_likes::song_id
.eq(song_id)
.and(song_likes::user_id.eq(user.id)),
),
)
.execute(db_conn)
.context("Error removing like for song")?;
} else {
diesel::delete(
song_dislikes::table.filter(
song_dislikes::song_id
.eq(song_id)
.and(song_dislikes::user_id.eq(user.id)),
),
)
.execute(db_conn)
.context("Error removing dislike for song")?;
}
Ok(())
}
/// Get the like and dislike status of a song
#[server(endpoint = "songs/get_like_dislike")]
pub async fn get_like_dislike_song(song_id: i32) -> Result<(bool, bool), ServerFnError> {
let user = get_user().await.map_err(|e| ServerFnError::<NoCustomError>::
ServerError(format!("Error getting user: {}", e)))?;
#[api_fn(endpoint = "songs/get_like_dislike")]
pub async fn get_like_dislike_song(
song_id: i32,
user: backend::User,
db_conn: &mut PgPooledConn,
) -> BackendResult<(bool, bool)> {
// TODO this could probably be done more efficiently with a tokio::try_join, but
// doing so is much more complicated than it would initially seem
let db_con = &mut get_db_conn();
let like = song_likes::table
.filter(
song_likes::song_id
.eq(song_id)
.and(song_likes::user_id.eq(user.id)),
)
.first::<(i32, i32)>(db_conn)
.optional()
.context("Error checking if song is liked")?
.is_some();
// TODO this could probably be done more efficiently with a tokio::try_join, but
// doing so is much more complicated than it would initially seem
let dislike = song_dislikes::table
.filter(
song_dislikes::song_id
.eq(song_id)
.and(song_dislikes::user_id.eq(user.id)),
)
.first::<(i32, i32)>(db_conn)
.optional()
.context("Error checking if song is disliked")?
.is_some();
let like = user.get_like_song(song_id, db_con).await.map_err(|e| ServerFnError::<NoCustomError>::
ServerError(format!("Error getting song liked: {}", e)))?;
let dislike = user.get_dislike_song(song_id, db_con).await.map_err(|e| ServerFnError::<NoCustomError>::
ServerError(format!("Error getting song disliked: {}", e)))?;
Ok((like, dislike))
}
#[api_fn(endpoint = "songs/get")]
pub async fn get_song_by_id(
song_id: i32,
user: backend::User,
db_conn: &mut PgPooledConn,
) -> BackendResult<Option<frontend::Song>> {
let song_parts: Vec<(
backend::Song,
Option<backend::Album>,
Option<backend::Artist>,
Option<(i32, i32)>,
Option<(i32, i32)>,
)> = songs::table
.find(song_id)
.left_join(albums::table.on(songs::album_id.eq(albums::id.nullable())))
.left_join(
song_artists::table
.inner_join(artists::table)
.on(songs::id.eq(song_artists::song_id)),
)
.left_join(
song_likes::table.on(songs::id
.eq(song_likes::song_id)
.and(song_likes::user_id.eq(user.id))),
)
.left_join(
song_dislikes::table.on(songs::id
.eq(song_dislikes::song_id)
.and(song_dislikes::user_id.eq(user.id))),
)
.select((
songs::all_columns,
albums::all_columns.nullable(),
artists::all_columns.nullable(),
song_likes::all_columns.nullable(),
song_dislikes::all_columns.nullable(),
))
.load(db_conn)
.context("Error loading song from database")?;
let song = song_parts.first().cloned();
let artists = song_parts
.into_iter()
.filter_map(|(_, _, artist, _, _)| artist)
.collect::<Vec<_>>();
match song {
Some((song, album, _artist, like, dislike)) => {
let image_path = song.image_web_path_or_placeholder(album.as_ref());
let song_path = song
.storage_path
.to_web_path(AssetType::Audio)
.context(format!(
"Error converting audio path to web path for song {} (id: {})",
song.title.clone(),
song.id
))?;
Ok(Some(frontend::Song {
id: song.id,
title: song.title.clone(),
artists,
album: album.clone(),
track: song.track,
duration: song.duration,
release_date: song.release_date,
song_path,
image_path,
like_dislike: Some((like.is_some(), dislike.is_some())),
added_date: song.added_date,
}))
}
None => Ok(None),
}
}
#[api_fn(endpoint = "songs/get_many")]
pub async fn get_songs_by_id(
song_ids: Vec<i32>,
user: backend::User,
db_conn: &mut PgPooledConn,
) -> BackendResult<Vec<Option<frontend::Song>>> {
let song_parts: Vec<(
backend::Song,
Option<backend::Album>,
Option<backend::Artist>,
Option<(i32, i32)>,
Option<(i32, i32)>,
)> = songs::table
.filter(songs::id.eq_any(song_ids.clone()))
.left_join(albums::table.on(songs::album_id.eq(albums::id.nullable())))
.left_join(
song_artists::table
.inner_join(artists::table)
.on(songs::id.eq(song_artists::song_id)),
)
.left_join(
song_likes::table.on(songs::id
.eq(song_likes::song_id)
.and(song_likes::user_id.eq(user.id))),
)
.left_join(
song_dislikes::table.on(songs::id
.eq(song_dislikes::song_id)
.and(song_dislikes::user_id.eq(user.id))),
)
.select((
songs::all_columns,
albums::all_columns.nullable(),
artists::all_columns.nullable(),
song_likes::all_columns.nullable(),
song_dislikes::all_columns.nullable(),
))
.load(db_conn)
.context("Error loading song from database")?;
let mut songs: HashMap<i32, frontend::Song> = HashMap::new();
for (song, album, artist, like, dislike) in song_parts {
if let Some(last_song) = songs.get_mut(&song.id) {
if let Some(artist) = artist {
last_song.artists.push(artist);
}
} else {
let image_path = song.image_web_path_or_placeholder(album.as_ref());
let song_path = song
.storage_path
.to_web_path(AssetType::Audio)
.context(format!(
"Error converting audio path to web path for song {} (id: {})",
song.title.clone(),
song.id
))?;
let new_song = frontend::Song {
id: song.id,
title: song.title.clone(),
artists: artist.map(|artist| vec![artist]).unwrap_or(vec![]),
album: album.clone(),
track: song.track,
duration: song.duration,
release_date: song.release_date,
song_path,
image_path,
like_dislike: Some((like.is_some(), dislike.is_some())),
added_date: song.added_date,
};
songs.insert(song.id, new_song);
}
}
let songs = song_ids
.iter()
.map(|song_id| songs.remove(song_id))
.collect();
Ok(songs)
}
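The final step above preserves request order: rows come back from the database in arbitrary order, so they are keyed by id and re-read in the order the caller asked for, yielding `None` for ids that were not found. A standalone sketch (with `String` standing in for the song type):

```rust
use std::collections::HashMap;

// Re-read looked-up values in the order the ids were requested,
// consuming the map so each id is returned at most once.
fn in_request_order(ids: &[i32], mut by_id: HashMap<i32, String>) -> Vec<Option<String>> {
    ids.iter().map(|id| by_id.remove(id)).collect()
}

fn main() {
    let by_id: HashMap<i32, String> =
        [(1, "a".to_string()), (3, "c".to_string())].into_iter().collect();
    println!("{:?}", in_request_order(&[3, 1, 2], by_id));
}
```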
#[api_fn(endpoint = "songs/plays")]
pub async fn get_song_plays(song_id: i32, db_conn: &mut PgPooledConn) -> BackendResult<i64> {
let plays = song_history::table
.filter(song_history::song_id.eq(song_id))
.count()
.get_result::<i64>(db_conn)
.context("Error getting song plays")?;
Ok(plays)
}
#[api_fn(endpoint = "songs/my-plays")]
pub async fn get_my_song_plays(
song_id: i32,
user: backend::User,
db_conn: &mut PgPooledConn,
) -> BackendResult<i64> {
let plays = song_history::table
.filter(
song_history::song_id
.eq(song_id)
.and(song_history::user_id.eq(user.id)),
)
.count()
.get_result::<i64>(db_conn)
.context("Error getting song plays for user")?;
Ok(plays)
}

324
src/api/upload.rs Normal file

@@ -0,0 +1,324 @@
use crate::prelude::*;
use server_fn::codec::MultipartData;
cfg_if! {
if #[cfg(feature = "ssr")] {
use multer::Field;
use std::fs;
}
}
/// Validate the artist ids in a multipart field
/// Expects a field with a comma-separated list of artist ids, and ensures each is a valid artist id in the database
#[cfg(feature = "ssr")]
async fn validate_artist_ids(artist_ids: Field<'static>) -> BackendResult<Vec<i32>> {
use diesel::result::Error::NotFound;
// Extract the artist id from the field
match artist_ids.text().await {
Ok(artist_ids) => {
let artist_ids = artist_ids.trim_end_matches(',').split(',');
let mut db_conn = BackendState::get().await?.get_db_conn()?;
artist_ids
.filter(|artist_id| !artist_id.is_empty())
.map(|artist_id| {
// Parse the artist id as an integer
if let Ok(artist_id) = artist_id.parse::<i32>() {
// Check if the artist exists
let artist = crate::schema::artists::dsl::artists
.find(artist_id)
.first::<backend::Artist>(&mut db_conn);
match artist {
Ok(_) => Ok(artist_id),
Err(NotFound) => Err(AccessError::NotFound.context("Artist not found")),
Err(e) => Err(e.context("Error finding artist id")),
}
} else {
Err(InputError::InvalidInput("Error parsing artist id".to_string()).into())
}
})
.collect()
}
Err(e) => Err(InputError::FieldReadError(format!("Error reading artist ids: {e}")).into()),
}
}
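The id-string handling in `validate_artist_ids` (minus the database existence check) can be sketched as a pure function; `parse_artist_ids` is a hypothetical helper name, not in the source:

```rust
// Sketch of the comma-separated id parsing in `validate_artist_ids`,
// with the database lookup omitted. A trailing comma is tolerated and
// empty entries are skipped; any non-integer entry is an error.
fn parse_artist_ids(raw: &str) -> Result<Vec<i32>, String> {
    raw.trim_end_matches(',')
        .split(',')
        .filter(|id| !id.is_empty())
        .map(|id| {
            id.parse::<i32>()
                .map_err(|_| "Error parsing artist id".to_string())
        })
        .collect()
}
```

Collecting an iterator of `Result`s into `Result<Vec<_>, _>` short-circuits on the first error, which is the same behavior the real function gets from its `.collect()`.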
/// Validate the album id in a multipart field
/// Expects a field with an album id, and ensures it is a valid album id in the database
#[cfg(feature = "ssr")]
async fn validate_album_id(album_id: Field<'static>) -> BackendResult<Option<i32>> {
use diesel::result::Error::NotFound;
// Extract the album id from the field
match album_id.text().await {
Ok(album_id) => {
if album_id.is_empty() {
return Ok(None);
}
// Parse the album id as an integer
if let Ok(album_id) = album_id.parse::<i32>() {
let mut db_conn = BackendState::get().await?.get_db_conn()?;
// Check if the album exists
let album = crate::schema::albums::dsl::albums
.find(album_id)
.first::<backend::Album>(&mut db_conn);
match album {
Ok(_) => Ok(Some(album_id)),
Err(NotFound) => Err(AccessError::NotFound.context("Album not found")),
Err(e) => Err(e.context("Error finding album id")),
}
} else {
Err(InputError::InvalidInput("Error parsing album id".to_string()).into())
}
}
Err(e) => Err(InputError::FieldReadError(format!("Error reading album id: {e}")).into()),
}
}
/// Validate the track number in a multipart field
/// Expects a field with a track number, and ensures it is a valid track number (non-negative integer)
#[cfg(feature = "ssr")]
async fn validate_track_number(track_number: Field<'static>) -> BackendResult<Option<i32>> {
match track_number.text().await {
Ok(track_number) => {
if track_number.is_empty() {
return Ok(None);
}
if let Ok(track_number) = track_number.parse::<i32>() {
if track_number < 0 {
Err(
InputError::InvalidInput("Track number must be positive or 0".to_string())
.into(),
)
} else {
Ok(Some(track_number))
}
} else {
Err(InputError::InvalidInput("Error parsing track number".to_string()).into())
}
}
Err(e) => {
Err(InputError::FieldReadError(format!("Error reading track number: {e}")).into())
}
}
}
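The rules in `validate_track_number` reduce to a small pure function (the helper name here is illustrative only): empty input means "no track number", negatives are rejected, and anything unparsable is an input error.

```rust
// Sketch of the track-number validation rules: None for empty input,
// Some(n) for a non-negative integer, an error otherwise.
fn parse_track_number(raw: &str) -> Result<Option<i32>, String> {
    if raw.is_empty() {
        return Ok(None);
    }
    match raw.parse::<i32>() {
        Ok(n) if n >= 0 => Ok(Some(n)),
        Ok(_) => Err("Track number must be positive or 0".to_string()),
        Err(_) => Err("Error parsing track number".to_string()),
    }
}
```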
/// Validate the release date in a multipart field
/// Expects a field with a release date, and ensures it is a valid date in the `YYYY-MM-DD` format
#[cfg(feature = "ssr")]
async fn validate_release_date(release_date: Field<'static>) -> BackendResult<Option<NaiveDate>> {
match release_date.text().await {
Ok(release_date) => {
if release_date.trim().is_empty() {
return Ok(None);
}
let release_date = NaiveDate::parse_from_str(release_date.trim(), "%Y-%m-%d");
match release_date {
Ok(release_date) => Ok(Some(release_date)),
Err(_) => Err(InputError::InvalidInput(
"Invalid release date format, expected YYYY-MM-DD".to_string(),
)
.into()),
}
}
Err(e) => Err(InputError::InvalidInput(format!("Error reading release date: {e}")).into()),
}
}
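The real check delegates to chrono's `NaiveDate::parse_from_str` with the `"%Y-%m-%d"` format string. As a simplified stand-in (it checks only the `YYYY-MM-DD` shape and coarse ranges, not per-month day counts or leap years), the acceptance rule looks like:

```rust
// Simplified stand-in for NaiveDate::parse_from_str(_, "%Y-%m-%d"):
// three dash-separated unsigned integers, month 1-12, day 1-31.
fn looks_like_date(raw: &str) -> bool {
    let parts: Vec<&str> = raw.trim().split('-').collect();
    if parts.len() != 3 {
        return false;
    }
    // Parse all three components; any failure yields None.
    let nums: Option<Vec<u32>> = parts.iter().map(|p| p.parse().ok()).collect();
    match nums.as_deref() {
        Some([_, m, d]) => (1..=12).contains(m) && (1..=31).contains(d),
        _ => false,
    }
}
```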
/// Handle the file upload form
#[api_fn(upload, endpoint = "/upload")]
pub async fn upload(
data: MultipartData,
db_conn: &mut PgPooledConn,
state: BackendState,
) -> BackendResult<()> {
// Safe to unwrap - "On the server side, this always returns Some(_). On the client side, always returns None."
let mut data = data.into_inner().unwrap();
let mut title = None;
let mut artist_ids = None;
let mut album_id = None;
let mut track = None;
let mut release_date = None;
let mut file_name = None;
let mut duration = None;
// Fetch the fields from the form data
while let Ok(Some(mut field)) = data.next_field().await {
let name = field.name().unwrap_or_default().to_string();
match name.as_str() {
"title" => {
title = Some(
extract_field(field)
.await
.context("Error extracting title field")?,
);
}
"artist_ids" => {
artist_ids = Some(
validate_artist_ids(field)
.await
.context("Error validating artist ids")?,
);
}
"album_id" => {
album_id = Some(
validate_album_id(field)
.await
.context("Error validating album id")?,
);
}
"track_number" => {
track = Some(
validate_track_number(field)
.await
.context("Error validating track number")?,
);
}
"release_date" => {
release_date = Some(
validate_release_date(field)
.await
.context("Error validating release date")?,
);
}
"file" => {
use crate::util::audio::extract_metadata;
use std::fs::OpenOptions;
use std::io::{Seek, Write};
use symphonia::core::codecs::CODEC_TYPE_MP3;
// Some logging is done here where there is high potential for bugs / failures,
// or behavior that we may wish to change in the future
let asset_type = AssetType::Audio;
let relative_path = asset_type.new_path("mp3");
file_name = Some(relative_path.clone());
let full_path = state
.get_asset_path(&asset_type)
.join(relative_path.clone().path());
let parent_path =
full_path
.parent()
.ok_or(BackendError::InternalError(format!(
"Unable to get parent of path {}",
full_path.display()
)))?;
fs::create_dir_all(parent_path)
.context("Failed to create parent directories for new song")?;
debug!("Saving uploaded file {}", full_path.display());
// Save file to disk
// Use these open options to create the file, write to it, then read from it
let mut file = OpenOptions::new()
.read(true)
.write(true)
.create(true)
.truncate(true)
.open(full_path)
.context("Error opening file for upload")?;
while let Some(chunk) = field.chunk().await.map_err(|e| {
InputError::FieldReadError(format!("Error reading file chunk: {e}"))
})? {
file.write_all(&chunk)
.context("Error writing field chunk to file")?;
}
file.flush().context("Error flushing file")?;
// Rewind the file so the duration can be measured
file.rewind().context("Error rewinding file")?;
// Get the codec and duration of the file
let (file_codec, file_duration) = extract_metadata(file)
.context("Error extracting metadata from uploaded file")?;
if file_codec != CODEC_TYPE_MP3 {
return Err(InputError::InvalidInput(format!(
"Invalid uploaded audio file codec: {file_codec}"
))
.into());
}
duration = Some(file_duration);
}
_ => {
warn!("Unknown file upload field: {}", name);
}
}
}
// Unwrap mandatory fields
let title = title.ok_or(InputError::MissingField("title".to_string()))?;
let artist_ids = artist_ids.unwrap_or(vec![]);
let file_name = file_name.ok_or(InputError::MissingField("file".to_string()))?;
let duration = duration.ok_or(InputError::MissingField("duration".to_string()))?;
let duration = i32::try_from(duration)
.map_err(|e| InputError::InvalidInput(format!("Error parsing duration: {e}")))
.context("Error converting duration to i32")?;
let album_id = album_id.unwrap_or(None);
let track = track.unwrap_or(None);
let release_date = release_date.unwrap_or(None);
if album_id.is_some() != track.is_some() {
return Err(InputError::InvalidInput(
"Album id and track number must both be present or both be absent".to_string(),
)
.into());
}
// Create the song
let song = backend::NewSong {
title,
album_id,
track,
duration,
release_date,
storage_path: file_name,
image_path: None,
};
// Save the song to the database
let song = song
.insert_into(songs::table)
.get_result::<backend::Song>(db_conn)
.context("Error adding song to database")?;
// Save the song's artists to the database
let artist_ids = artist_ids
.into_iter()
.map(|artist_id| {
(
song_artists::song_id.eq(song.id),
song_artists::artist_id.eq(artist_id),
)
})
.collect::<Vec<_>>();
diesel::insert_into(song_artists::table)
.values(&artist_ids)
.execute(db_conn)
.context("Error saving song artists to database")?;
Ok(())
}
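The album/track pairing rule enforced near the end of `upload` (a track number only makes sense within an album, so both must be given together or both omitted) can be stated as a tiny predicate; `album_track_consistent` is a hypothetical name for illustration:

```rust
// The invariant behind `album_id.is_some() != track.is_some()` in `upload`:
// the pair is consistent only when both are present or both are absent.
fn album_track_consistent(album_id: Option<i32>, track: Option<i32>) -> bool {
    album_id.is_some() == track.is_some()
}
```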

150
src/api/users.rs Normal file

@@ -0,0 +1,150 @@
use crate::prelude::*;
cfg_if! {
if #[cfg(feature = "ssr")] {
use pbkdf2::{
password_hash::{
rand_core::OsRng,
PasswordHasher, PasswordHash, SaltString, PasswordVerifier, Error
},
Pbkdf2
};
}
}
#[derive(Clone, Debug, Serialize, Deserialize)]
pub struct UserCredentials {
pub username_or_email: String,
pub password: String,
}
/// Get a user from the database by username or email
/// Returns a Result with the user if found, None if not found, or an error if there was a problem
#[cfg(feature = "ssr")]
pub async fn find_user(
username_or_email: String,
db_conn: &mut PgPooledConn,
) -> BackendResult<Option<backend::User>> {
// Look for either a username or email that matches the input, and return an option with None if no user is found
let user = users::table
.filter(users::username.eq(username_or_email.clone()))
.or_filter(users::email.eq(username_or_email))
.first::<backend::User>(db_conn)
.optional()
.context("Error loading user from database")?;
Ok(user)
}
/// Get a user from the database by ID
/// Returns a Result with the user if found, None if not found, or an error if there was a problem
#[cfg(feature = "ssr")]
pub async fn find_user_by_id(
user_id: i32,
db_conn: &mut PgPooledConn,
) -> BackendResult<Option<backend::User>> {
let user = users::table
.filter(users::id.eq(user_id))
.first::<backend::User>(db_conn)
.optional()
.context("Error loading user from database")?;
Ok(user)
}
/// Create a new user in the database
/// Returns an empty Result if successful, or an error if there was a problem
#[cfg(feature = "ssr")]
pub async fn create_user(new_user: &backend::NewUser) -> BackendResult<()> {
let new_password =
new_user
.password
.clone()
.ok_or(BackendError::InputError(InputError::MissingField(
"password".to_string(),
)))?;
let salt = SaltString::generate(&mut OsRng);
let password_hash = Pbkdf2
.hash_password(new_password.as_bytes(), &salt)
.map_err(|e| AuthError::AuthError(format!("Error hashing password: {e}")))?
.to_string();
let new_user = backend::NewUser {
password: Some(password_hash),
..new_user.clone()
};
let mut db_conn = BackendState::get().await?.get_db_conn()?;
diesel::insert_into(users::table)
.values(&new_user)
.execute(&mut db_conn)
.context("Error inserting new user into database")?;
Ok(())
}
/// Validate a user's credentials
/// Returns a Result with the user if the credentials are valid, None if not valid, or an error if there was a problem
#[cfg(feature = "ssr")]
pub async fn validate_user(
credentials: UserCredentials,
db_conn: &mut PgPooledConn,
) -> BackendResult<Option<backend::User>> {
let db_user = find_user(credentials.username_or_email.clone(), db_conn)
.await
.context("Error finding user in database")?;
// If the user is not found, return None
let db_user = match db_user {
Some(user) => user,
None => return Ok(None),
};
let db_password = db_user.password.clone().ok_or(AuthError::AuthError(
"No password stored for user".to_string(),
))?;
let password_hash = PasswordHash::new(&db_password)
.map_err(|e| AuthError::AuthError(format!("{e}")))
.context("Error parsing password hash from database")?;
match Pbkdf2.verify_password(credentials.password.as_bytes(), &password_hash) {
Ok(()) => {}
Err(Error::Password) => {
return Ok(None);
}
Err(e) => {
return Err(
AuthError::AuthError(format!("{e}")).context("Error verifying password hash")
);
}
}
Ok(Some(db_user))
}
/// Get a user from the database by username or email
/// Returns a Result with the user if found, None if not found, or an error if there was a problem
#[api_fn(endpoint = "find_user")]
pub async fn get_user(
username_or_email: String,
db_conn: &mut PgPooledConn,
) -> BackendResult<Option<frontend::User>> {
find_user(username_or_email, db_conn)
.await
.context("Error finding user by username or email")
.map(|result| result.map(|user| user.into()))
}
#[api_fn(endpoint = "get_user_by_id")]
pub async fn get_user_by_id(
user_id: i32,
db_conn: &mut PgPooledConn,
) -> BackendResult<Option<frontend::User>> {
find_user_by_id(user_id, db_conn)
.await
.context("Error finding user by ID")
.map(|result| result.map(|user| user.into()))
}


@@ -1,15 +1,28 @@
use crate::playbar::PlayBar;
use crate::playbar::CustomTitle;
use crate::queue::Queue;
use leptos::*;
use crate::components::error_template::AppError;
use crate::prelude::*;
use leptos_meta::*;
use leptos_router::components::*;
use leptos_router::*;
use crate::pages::login::*;
use crate::pages::signup::*;
use crate::pages::profile::*;
use crate::pages::albumpage::*;
use crate::error_template::{AppError, ErrorTemplate};
use crate::util::state::GlobalState;
pub fn shell(options: LeptosOptions) -> impl IntoView {
view! {
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8"/>
<meta name="viewport" content="width=device-width, initial-scale=1"/>
<AutoReload options=options.clone() />
<HydrationScripts options=options.clone() />
<HashedStylesheet id="leptos" options />
<MetaTags/>
</head>
<body>
<App/>
</body>
</html>
}
}
#[component]
pub fn App() -> impl IntoView {
@@ -18,62 +31,68 @@ pub fn App() -> impl IntoView {
provide_context(GlobalState::new());
let upload_open = create_rw_signal(false);
let upload_open = RwSignal::new(false);
let add_artist_open = RwSignal::new(false);
let add_album_open = RwSignal::new(false);
view! {
// injects a stylesheet into the document <head>
// id=leptos means cargo-leptos will hot-reload this stylesheet
<Stylesheet id="leptos" href="/pkg/libretunes.css"/>
// sets the document title
<CustomTitle />
// content for this welcome page
<Router fallback=|| {
let mut outside_errors = Errors::default();
outside_errors.insert_with_default_key(AppError::NotFound);
view! {
<ErrorTemplate outside_errors/>
}
.into_view()
}>
<Router>
<main>
<Routes>
<Route path="" view=move || view! { <HomePage upload_open=upload_open/> }>
<Route path="" view=Dashboard />
<Route path="dashboard" view=Dashboard />
<Route path="search" view=Search />
<Route path="user/:id" view=Profile />
<Route path="user" view=Profile />
<Route path="album/:id" view=AlbumPage />
</Route>
<Route path="/login" view=Login />
<Route path="/signup" view=Signup />
<Routes fallback=|| {
let mut outside_errors = Errors::default();
outside_errors.insert_with_default_key(AppError::NotFound);
view! {
<ErrorTemplate outside_errors/>
}
.into_view()
}>
<ParentRoute path=path!("") view=move || view! { <HomePage upload_open=upload_open add_artist_open=add_artist_open add_album_open=add_album_open/> }>
<Route path=path!("") view=Dashboard />
<Route path=path!("dashboard") view=Dashboard />
<Route path=path!("search") view=Search />
<Route path=path!("user/:id") view=ProfilePage />
<Route path=path!("user") view=ProfilePage />
<Route path=path!("album/:id") view=AlbumPage />
<Route path=path!("artist/:id") view=ArtistPage />
<Route path=path!("song/:id") view=SongPage />
<Route path=path!("playlist/:id") view=PlaylistPage />
<Route path=path!("liked") view=LikedSongsPage />
</ParentRoute>
<Route path=path!("/login") view=Login />
<Route path=path!("/signup") view=Signup />
</Routes>
</main>
</Router>
}
}
use crate::components::sidebar::*;
use crate::components::dashboard::*;
use crate::components::search::*;
use crate::components::personal::Personal;
use crate::components::upload::*;
/// Renders the home page of your application.
#[component]
fn HomePage(upload_open: RwSignal<bool>) -> impl IntoView {
fn HomePage(
upload_open: RwSignal<bool>,
add_artist_open: RwSignal<bool>,
add_album_open: RwSignal<bool>,
) -> impl IntoView {
view! {
<div class="home-container">
<section class="bg-black h-screen flex">
<Upload open=upload_open/>
<Sidebar upload_open=upload_open/>
<AddArtist open=add_artist_open/>
<AddAlbum open=add_album_open/>
<Sidebar upload_open=upload_open add_artist_open=add_artist_open add_album_open=add_album_open/>
// This <Outlet /> will render the child route components
<Outlet />
<div class="flex flex-col flex-grow min-w-0">
<div class="home-card">
<Outlet />
</div>
</div>
<Personal />
<PlayBar />
<Queue />
</div>
<PlayBar {..} node_ref={GlobalState::playbar_element()} />
</section>
}
}


@@ -1,34 +0,0 @@
use crate::components::dashboard_tile::DashboardTile;
use serde::{Serialize, Deserialize};
/// Holds information about an artist
///
/// Intended to be used in the front-end
#[derive(Clone, Serialize, Deserialize)]
pub struct ArtistData {
/// Artist id
pub id: i32,
/// Artist name
pub name: String,
/// Path to artist image, relative to the root of the web server.
/// For example, `"/assets/images/Artist.jpg"`
pub image_path: String,
}
impl DashboardTile for ArtistData {
fn image_path(&self) -> String {
self.image_path.clone()
}
fn title(&self) -> String {
self.name.clone()
}
fn link(&self) -> String {
format!("/artist/{}", self.id)
}
fn description(&self) -> Option<String> {
Some("Artist".to_string())
}
}


@@ -1,201 +0,0 @@
use leptos::*;
use cfg_if::cfg_if;
cfg_if! {
if #[cfg(feature = "ssr")] {
use leptos::server_fn::error::NoCustomError;
use leptos_axum::extract;
use axum_login::AuthSession;
use crate::auth_backend::AuthBackend;
}
}
use crate::models::User;
use crate::users::UserCredentials;
/// Create a new user and log them in
/// Takes in a User struct, with the password in plaintext
/// Returns a Result with the error message if the user could not be created
#[server(endpoint = "signup")]
pub async fn signup(new_user: User) -> Result<(), ServerFnError> {
// Check LIBRETUNES_DISABLE_SIGNUP env var
if std::env::var("LIBRETUNES_DISABLE_SIGNUP").is_ok_and(|v| v == "true") {
return Err(ServerFnError::<NoCustomError>::ServerError("Signup is disabled".to_string()));
}
use crate::users::create_user;
// Ensure the user has no id, and is not a self-proclaimed admin
let new_user = User {
id: None,
admin: false,
..new_user
};
create_user(&new_user).await
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error creating user: {}", e)))?;
let mut auth_session = extract::<AuthSession<AuthBackend>>().await
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error getting auth session: {}", e)))?;
let credentials = UserCredentials {
username_or_email: new_user.username.clone(),
password: new_user.password.clone().unwrap()
};
match auth_session.authenticate(credentials).await {
Ok(Some(user)) => {
auth_session.login(&user).await
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error logging in user: {}", e)))
},
Ok(None) => {
Err(ServerFnError::<NoCustomError>::ServerError("Error authenticating user: User not found".to_string()))
},
Err(e) => {
Err(ServerFnError::<NoCustomError>::ServerError(format!("Error authenticating user: {}", e)))
}
}
}
/// Log a user in
/// Takes in a username or email and a password in plaintext
/// Returns a Result with a boolean indicating if the login was successful
#[server(endpoint = "login")]
pub async fn login(credentials: UserCredentials) -> Result<Option<User>, ServerFnError> {
use crate::users::validate_user;
let mut auth_session = extract::<AuthSession<AuthBackend>>().await
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error getting auth session: {}", e)))?;
let user = validate_user(credentials).await
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error validating user: {}", e)))?;
if let Some(mut user) = user {
auth_session.login(&user).await
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error logging in user: {}", e)))?;
user.password = None;
Ok(Some(user))
} else {
Ok(None)
}
}
/// Log a user out
/// Returns a Result with the error message if the user could not be logged out
#[server(endpoint = "logout")]
pub async fn logout() -> Result<(), ServerFnError> {
let mut auth_session = extract::<AuthSession<AuthBackend>>().await
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error getting auth session: {}", e)))?;
auth_session.logout().await
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error getting auth session: {}", e)))?;
leptos_axum::redirect("/login");
Ok(())
}
/// Check if a user is logged in
/// Returns a Result with a boolean indicating if the user is logged in
#[server(endpoint = "check_auth")]
pub async fn check_auth() -> Result<bool, ServerFnError> {
let auth_session = extract::<AuthSession<AuthBackend>>().await
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error getting auth session: {}", e)))?;
Ok(auth_session.user.is_some())
}
/// Require that a user is logged in
/// Returns a Result with the error message if the user is not logged in
/// Intended to be used at the start of a protected route, to ensure the user is logged in:
/// ```rust
/// use leptos::*;
/// use libretunes::auth::require_auth;
/// #[server(endpoint = "protected_route")]
/// pub async fn protected_route() -> Result<(), ServerFnError> {
/// require_auth().await?;
/// // Continue with protected route
/// Ok(())
/// }
/// ```
#[cfg(feature = "ssr")]
pub async fn require_auth() -> Result<(), ServerFnError> {
check_auth().await.and_then(|logged_in| {
if logged_in {
Ok(())
} else {
Err(ServerFnError::<NoCustomError>::ServerError(format!("Unauthorized")))
}
})
}
/// Get the current logged-in user
/// Returns a Result with the user if they are logged in
/// Returns an error if the user is not logged in, or if there is an error getting the user
/// Intended to be used in a route to get the current user:
/// ```rust
/// use leptos::*;
/// use libretunes::auth::get_user;
/// #[server(endpoint = "user_route")]
/// pub async fn user_route() -> Result<(), ServerFnError> {
/// let user = get_user().await?;
/// println!("Logged in as: {}", user.username);
/// // Do something with the user
/// Ok(())
/// }
/// ```
#[cfg(feature = "ssr")]
pub async fn get_user() -> Result<User, ServerFnError> {
let auth_session = extract::<AuthSession<AuthBackend>>().await
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error getting auth session: {}", e)))?;
auth_session.user.ok_or(ServerFnError::<NoCustomError>::ServerError("User not logged in".to_string()))
}
#[server(endpoint = "get_logged_in_user")]
pub async fn get_logged_in_user() -> Result<Option<User>, ServerFnError> {
let auth_session = extract::<AuthSession<AuthBackend>>().await
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error getting auth session: {}", e)))?;
let user = auth_session.user.map(|mut user| {
user.password = None;
user
});
Ok(user)
}
/// Check if a user is an admin
/// Returns a Result with a boolean indicating if the user is logged in and an admin
#[server(endpoint = "check_admin")]
pub async fn check_admin() -> Result<bool, ServerFnError> {
let auth_session = extract::<AuthSession<AuthBackend>>().await
.map_err(|e| ServerFnError::<NoCustomError>::ServerError(format!("Error getting auth session: {}", e)))?;
Ok(auth_session.user.as_ref().map(|u| u.admin).unwrap_or(false))
}
/// Require that a user is logged in and an admin
/// Returns a Result with the error message if the user is not logged in or is not an admin
/// Intended to be used at the start of a protected route, to ensure the user is logged in and an admin:
/// ```rust
/// use leptos::*;
/// use libretunes::auth::require_admin;
/// #[server(endpoint = "protected_admin_route")]
/// pub async fn protected_admin_route() -> Result<(), ServerFnError> {
/// require_admin().await?;
/// // Continue with protected route
/// Ok(())
/// }
/// ```
#[cfg(feature = "ssr")]
pub async fn require_admin() -> Result<(), ServerFnError> {
check_admin().await.and_then(|is_admin| {
if is_admin {
Ok(())
} else {
Err(ServerFnError::<NoCustomError>::ServerError(format!("Unauthorized")))
}
})
}


@@ -1,48 +0,0 @@
use axum_login::{AuthnBackend, AuthUser, UserId};
use crate::users::UserCredentials;
use leptos::server_fn::error::ServerFnErrorErr;
use crate::models::User;
use cfg_if::cfg_if;
cfg_if! {
if #[cfg(feature = "ssr")] {
use async_trait::async_trait;
}
}
impl AuthUser for User {
type Id = i32;
// TODO: Ideally, we shouldn't have to unwrap here
fn id(&self) -> Self::Id {
self.id.unwrap()
}
fn session_auth_hash(&self) -> &[u8] {
self.password.as_ref().unwrap().as_bytes()
}
}
#[derive(Clone)]
pub struct AuthBackend;
#[cfg(feature = "ssr")]
#[async_trait]
impl AuthnBackend for AuthBackend {
type User = User;
type Credentials = UserCredentials;
type Error = ServerFnErrorErr;
async fn authenticate(&self, creds: Self::Credentials) -> Result<Option<Self::User>, Self::Error> {
crate::users::validate_user(creds).await
.map_err(|e| ServerFnErrorErr::ServerError(format!("Error validating user: {}", e)))
}
async fn get_user(&self, user_id: &UserId<Self>) -> Result<Option<Self::User>, Self::Error> {
crate::users::find_user_by_id(*user_id).await
.map_err(|e| ServerFnErrorErr::ServerError(format!("Error getting user: {}", e)))
}
}


@@ -1,11 +0,0 @@
pub mod sidebar;
pub mod dashboard;
pub mod search;
pub mod personal;
pub mod dashboard_tile;
pub mod dashboard_row;
pub mod upload;
pub mod song_list;
pub mod loading;
pub mod error;
pub mod album_info;


@@ -0,0 +1,89 @@
use crate::prelude::*;
#[component]
pub fn AddAlbumBtn(add_album_open: RwSignal<bool>) -> impl IntoView {
let open_dialog = move |_| {
add_album_open.set(true);
};
view! {
<button class="add-album-btn add-btns" on:click=open_dialog>
Add Album
</button>
}
}
#[component]
pub fn AddAlbum(open: RwSignal<bool>) -> impl IntoView {
let album_title = RwSignal::new("".to_string());
let release_date = RwSignal::new("".to_string());
let image_path = RwSignal::new("".to_string());
let close_dialog = move |ev: leptos::ev::MouseEvent| {
ev.prevent_default();
open.set(false);
};
let on_add_album = move |ev: leptos::ev::SubmitEvent| {
ev.prevent_default();
let album_title_clone = album_title.get();
let release_date_clone = Some(release_date.get());
let image_path_clone = Some(image_path.get());
spawn_local(async move {
let add_album_result =
api::albums::add_album(album_title_clone, release_date_clone, image_path_clone)
.await;
match add_album_result {
    Ok(album) => {
        leptos_log!("Added album: {:?}", album);
        album_title.set("".to_string());
        release_date.set("".to_string());
        image_path.set("".to_string());
    }
    Err(err) => leptos_log!("Error adding album: {:?}", err),
}
});
};
view! {
<Show when=open fallback=move || view! {}>
<div class="add-album-container">
<div class="upload-header">
<h1>Add Album</h1>
</div>
<div class="close-button" on:click=close_dialog><Icon icon={icondata::IoClose} /></div>
<form class="create-album-form" on:submit=on_add_album>
<div class="input-bx">
<input type="text" required class="text-input"
prop:value=album_title
on:input=move |ev: leptos::ev::Event| {
album_title.set(event_target_value(&ev));
}
/>
<span>Album Title</span>
</div>
<div class="release-date">
<div class="left">
<span>Release</span>
<span>Date</span>
</div>
<input class="info" type="date"
prop:value=release_date
on:input=move |ev: leptos::ev::Event| {
release_date.set(event_target_value(&ev));
}
/>
</div>
<div class="input-bx">
<input type="text" class="text-input"
prop:value=image_path
on:input=move |ev: leptos::ev::Event| {
image_path.set(event_target_value(&ev));
}
/>
<span>Image Path</span>
</div>
<button type="submit" class="upload-button">Add</button>
</form>
</div>
</Show>
}
}


@@ -0,0 +1,58 @@
use crate::prelude::*;
#[component]
pub fn AddArtistBtn(add_artist_open: RwSignal<bool>) -> impl IntoView {
let open_dialog = move |_| {
add_artist_open.set(true);
};
view! {
<button class="add-artist-btn add-btns" on:click=open_dialog>
Add Artist
</button>
}
}
#[component]
pub fn AddArtist(open: RwSignal<bool>) -> impl IntoView {
let artist_name = RwSignal::new("".to_string());
let close_dialog = move |ev: leptos::ev::MouseEvent| {
ev.prevent_default();
open.set(false);
};
let on_add_artist = move |ev: leptos::ev::SubmitEvent| {
ev.prevent_default();
let artist_name_clone = artist_name.get();
spawn_local(async move {
let add_artist_result = api::artists::add_artist(artist_name_clone).await;
match add_artist_result {
    Ok(artist) => {
        leptos_log!("Added artist: {:?}", artist);
        artist_name.set("".to_string());
    }
    Err(err) => leptos_log!("Error adding artist: {:?}", err),
}
});
};
view! {
<Show when=open fallback=move || view! {}>
<div class="add-artist-container">
<div class="upload-header">
<h1>Add Artist</h1>
</div>
<div class="close-button" on:click=close_dialog><Icon icon={icondata::IoClose} /></div>
<form class="create-artist-form" on:submit=on_add_artist>
<div class="input-bx">
<input type="text" name="title" required class="text-input"
prop:value=artist_name
on:input=move |ev: leptos::ev::Event| {
artist_name.set(event_target_value(&ev));
}
/>
<span>Artist Name</span>
</div>
<button type="submit" class="upload-button">Add</button>
</form>
</div>
</Show>
}
}


@@ -1,25 +0,0 @@
use leptos::leptos_dom::*;
use leptos::*;
use crate::albumdata::AlbumData;
#[component]
pub fn AlbumInfo(albumdata: AlbumData) -> impl IntoView {
view! {
<div class="album-info">
<img class="album-image" src={albumdata.image_path} alt="dashboard-tile" />
<div class="album-body">
<p class="album-title">{albumdata.title}</p>
<div class="album-artists">
{
albumdata.artists.iter().map(|artist| {
view! {
<a class="album-artist" href={format!("/artist/{}", artist.id.unwrap())}>{artist.name.clone()}</a>
}
}).collect::<Vec<_>>()
}
</div>
</div>
</div>
}.into_view()
}


@@ -0,0 +1,29 @@
use crate::prelude::*;
/// Displays a song's artists, with links to their artist pages
#[component]
pub fn ArtistList(artists: Vec<backend::Artist>) -> impl IntoView {
let num_artists = artists.len() as isize;
artists
.iter()
.enumerate()
.map(|(i, artist)| {
let i = i as isize;
view! {
<a class="hover:underline active:text-controls-active"
href={format!("/artist/{}", artist.id)}>{artist.name.clone()}</a>
{
use std::cmp::Ordering;
match i.cmp(&(num_artists - 2)) {
Ordering::Less => ", ",
Ordering::Equal => " & ",
Ordering::Greater => "",
}
}
}
})
.collect::<Vec<_>>()
}
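The separator logic in `ArtistList` joins names with ", " except before the last name, which gets " & ". The same `Ordering`-based rule can be sketched as a plain string join (`join_artists` is an illustrative name, not part of the component):

```rust
// Sketch of ArtistList's separator rule: index compared against
// (len - 2) picks ", " (Less), " & " (Equal), or nothing (Greater).
fn join_artists(names: &[&str]) -> String {
    use std::cmp::Ordering;
    let n = names.len() as isize;
    names
        .iter()
        .enumerate()
        .map(|(i, name)| {
            let sep = match (i as isize).cmp(&(n - 2)) {
                Ordering::Less => ", ",
                Ordering::Equal => " & ",
                Ordering::Greater => "",
            };
            format!("{name}{sep}")
        })
        .collect()
}
```

Casting to `isize` matters: with one artist, `n - 2` is -1, which a `usize` subtraction would underflow on.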


@@ -1,10 +0,0 @@
use leptos::*;
#[component]
pub fn Dashboard() -> impl IntoView {
view! {
<div class="dashboard-container home-component">
<h1 class="dashboard-header">Dashboard</h1>
</div>
}
}


@@ -1,118 +1,117 @@
use crate::prelude::*;
use leptos::html::Ul;
use leptos::leptos_dom::*;
use leptos::*;
use leptos_use::{use_element_size, UseElementSizeReturn, use_scroll, UseScrollReturn};
use crate::components::dashboard_tile::DashboardTile;
use leptos_icons::*;
/// A row of dashboard tiles, with a title
pub struct DashboardRow {
pub title: String,
pub tiles: Vec<Box<dyn DashboardTile>>,
}
#[component]
pub fn DashboardRow(
#[prop(into)] title: TextProp,
#[prop(default=vec![])] tiles: Vec<DashboardTile>,
) -> impl IntoView {
let list_ref = NodeRef::<Ul>::new();
impl DashboardRow {
pub fn new(title: String, tiles: Vec<Box<dyn DashboardTile>>) -> Self {
Self {
title,
tiles,
}
}
}
// Scroll functions attempt to align the left edge of the scroll area with the left edge of a tile
// This is done by scrolling to the nearest multiple of the tile width, plus some for padding
impl IntoView for DashboardRow {
fn into_view(self) -> View {
let list_ref = create_node_ref::<Ul>();
let scroll_left = move |_| {
if let Some(scroll_element) = list_ref.get_untracked() {
let client_width = scroll_element.client_width() as f64;
let current_pos = scroll_element.scroll_left() as f64;
let desired_pos = current_pos - client_width;
// Scroll functions attempt to align the left edge of the scroll area with the left edge of a tile
// This is done by scrolling to the nearest multiple of the tile width, plus some for padding
if let Some(first_tile) = scroll_element.first_element_child() {
let tile_width = first_tile.client_width() as f64;
let scroll_pos = desired_pos + (tile_width - (desired_pos % tile_width));
scroll_element.scroll_to_with_x_and_y(scroll_pos, 0.0);
} else {
leptos_warn!("Could not get first tile to scroll left");
// Fall back to scrolling by the client width if we can't get the tile width
scroll_element.scroll_to_with_x_and_y(desired_pos, 0.0);
}
} else {
leptos_warn!("Could not get scroll element to scroll left");
}
};
let scroll_left = move |_| {
if let Some(scroll_element) = list_ref.get_untracked() {
let client_width = scroll_element.client_width() as f64;
let current_pos = scroll_element.scroll_left() as f64;
let desired_pos = current_pos - client_width;
let scroll_right = move |_| {
if let Some(scroll_element) = list_ref.get_untracked() {
let client_width = scroll_element.client_width() as f64;
let current_pos = scroll_element.scroll_left() as f64;
let desired_pos = current_pos + client_width;
if let Some(first_tile) = scroll_element.first_element_child() {
let tile_width = first_tile.client_width() as f64;
let scroll_pos = desired_pos + (tile_width - (desired_pos % tile_width));
scroll_element.scroll_to_with_x_and_y(scroll_pos, 0.0);
} else {
warn!("Could not get first tile to scroll left");
// Fall back to scrolling by the client width if we can't get the tile width
scroll_element.scroll_to_with_x_and_y(desired_pos, 0.0);
}
} else {
warn!("Could not get scroll element to scroll left");
}
};
if let Some(first_tile) = scroll_element.first_element_child() {
let tile_width = first_tile.client_width() as f64;
let scroll_pos = desired_pos - (desired_pos % tile_width);
scroll_element.scroll_to_with_x_and_y(scroll_pos, 0.0);
} else {
leptos_warn!("Could not get first tile to scroll right");
// Fall back to scrolling by the client width if we can't get the tile width
scroll_element.scroll_to_with_x_and_y(desired_pos, 0.0);
}
} else {
leptos_warn!("Could not get scroll element to scroll right");
}
};
let scroll_right = move |_| {
if let Some(scroll_element) = list_ref.get_untracked() {
let client_width = scroll_element.client_width() as f64;
let current_pos = scroll_element.scroll_left() as f64;
let desired_pos = current_pos + client_width;
if let Some(first_tile) = scroll_element.first_element_child() {
let tile_width = first_tile.client_width() as f64;
let scroll_pos = desired_pos - (desired_pos % tile_width);
scroll_element.scroll_to_with_x_and_y(scroll_pos, 0.0);
} else {
warn!("Could not get first tile to scroll right");
// Fall back to scrolling by the client width if we can't get the tile width
scroll_element.scroll_to_with_x_and_y(desired_pos, 0.0);
}
} else {
warn!("Could not get scroll element to scroll right");
}
};
let UseElementSizeReturn {
width: scroll_element_width,
..
} = use_element_size(list_ref);
let UseScrollReturn { x: scroll_x, .. } = use_scroll(list_ref);
let UseElementSizeReturn { width: scroll_element_width, .. } = use_element_size(list_ref);
let UseScrollReturn { x: scroll_x, .. } = use_scroll(list_ref);
let scroll_right_hidden = Signal::derive(move || {
if let Some(scroll_element) = list_ref.get() {
if scroll_element.scroll_width() as f64 - scroll_element_width.get() <= scroll_x.get() {
"visibility: hidden"
} else {
""
}
} else {
""
}
});
let scroll_left_hidden = Signal::derive(move || {
if scroll_x.get() <= 0.0 {
"visibility: hidden"
} else {
""
}
});
view! {
<div class="dashboard-tile-row">
<div class="dashboard-tile-row-title-row">
<h2>{self.title}</h2>
<div class="dashboard-tile-row-scroll-btn">
<button on:click=scroll_left tabindex=-1 style=scroll_left_hidden>
<Icon class="dashboard-tile-row-scroll" icon=icondata::FiChevronLeft />
</button>
<button on:click=scroll_right tabindex=-1 style=scroll_right_hidden>
<Icon class="dashboard-tile-row-scroll" icon=icondata::FiChevronRight />
</button>
</div>
view! {
<div>
<div class="flex">
<h2 class="text-xl font-bold">{move || title.get()}</h2>
<div class="m-auto mr-0">
<button class="control" on:click=scroll_left tabindex=-1 style=scroll_left_hidden>
<Icon icon={icondata::FiChevronLeft} {..} class="w-7 h-7" />
</button>
<button class="control" on:click=scroll_right tabindex=-1 style=scroll_right_hidden>
<Icon icon={icondata::FiChevronRight} {..} class="w-7 h-7" />
</button>
</div>
<ul _ref={list_ref}>
{self.tiles.into_iter().map(|tile_info| {
view! {
<li>
{ tile_info.into_view() }
</li>
}
}).collect::<Vec<_>>()}
</ul>
</div>
}.into_view()
}
<ul class="flex overflow-x-hidden scroll-smooth ps-0"
style="mask-image: linear-gradient(90deg, black, 95%, transparent);
-webkit-mask-image: linear-gradient(90deg, black, 95%, transparent);" node_ref={list_ref}>
{tiles.into_iter().map(|tile| {
view! {
<li>
<div class="mr-2.5">
<a href={move || tile.link.get()}>
<img class="w-50 h-50 max-w-none rounded-md mr-5"
src={move || tile.image_path.get()} alt="dashboard-tile" />
<p class="text-lg font-semibold">{move || tile.title.get()}</p>
<p>
{move || tile.description.as_ref().map(|desc| desc.get())}
</p>
</a>
</div>
</li>
}
}).collect::<Vec<_>>()}
</ul>
</div>
}.into_view()
}
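The tile-alignment comment above ("scrolling to the nearest multiple of the tile width") reduces to a small piece of modular arithmetic. A minimal sketch, with `snap_down` as an assumed helper name rather than code from the component:

```rust
// Snap a desired scroll offset down to a multiple of the tile width, so the
// viewport's left edge lands on a tile boundary rather than mid-tile.
fn snap_down(desired_pos: f64, tile_width: f64) -> f64 {
    desired_pos - (desired_pos % tile_width)
}

fn main() {
    // With 210px-wide tiles, a desired offset of 500px snaps back to 420px.
    assert_eq!(snap_down(500.0, 210.0), 420.0);
    // An offset already on a boundary is unchanged.
    assert_eq!(snap_down(420.0, 210.0), 420.0);
}
```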


@@ -1,27 +1,13 @@
use leptos::leptos_dom::*;
use leptos::*;
use crate::prelude::*;
pub trait DashboardTile {
fn image_path(&self) -> String;
fn title(&self) -> String;
fn link(&self) -> String;
fn description(&self) -> Option<String> { None }
}
impl IntoView for &dyn DashboardTile {
fn into_view(self) -> View {
let link = self.link();
view! {
<div class="dashboard-tile">
<a href={link}>
<img src={self.image_path()} alt="dashboard-tile" />
<p class="dashboard-tile-title">{self.title()}</p>
<p class="dashboard-tile-description">
{self.description().unwrap_or_default()}
</p>
</a>
</div>
}.into_view()
}
#[slot]
pub struct DashboardTile {
#[prop(into)]
image_path: TextProp,
#[prop(into)]
title: TextProp,
#[prop(into)]
link: TextProp,
#[prop(into, optional)]
description: Option<TextProp>,
}


@@ -1,45 +1,39 @@
use leptos::*;
use leptos_icons::*;
use crate::prelude::*;
use std::fmt::Display;
#[component]
pub fn ServerError<E: Display + 'static>(
#[prop(optional, into, default="An Error Occurred".into())]
title: TextProp,
#[prop(optional, into)]
message: TextProp,
#[prop(optional, into)]
error: Option<ServerFnError<E>>,
#[prop(optional, into, default="An Error Occurred".into())] title: TextProp,
#[prop(optional, into)] message: TextProp,
#[prop(optional, into)] error: Option<ServerFnError<E>>,
) -> impl IntoView {
view!{
<div class="error-container">
<div class="error-header">
<Icon icon=icondata::BiErrorSolid />
<h1>{title}</h1>
</div>
<p>{message}</p>
<p>{error.map(|error| format!("{}", error))}</p>
</div>
}
view! {
<div class="error-container">
<div class="error-header">
<Icon icon={icondata::BiErrorSolid} />
<h1>{move || title.get()}</h1>
</div>
<p>{move || message.get()}</p>
<p>{error.map(|error| format!("{error}"))}</p>
</div>
}
}
#[component]
pub fn Error<E: Display + 'static>(
#[prop(optional, into, default="An Error Occurred".into())]
title: TextProp,
#[prop(optional, into)]
message: TextProp,
#[prop(optional, into)]
error: Option<E>,
#[prop(optional, into, default="An Error Occurred".into())] title: TextProp,
#[prop(optional, into)] message: TextProp,
#[prop(optional, into)] error: Option<E>,
) -> impl IntoView {
view! {
<div class="error-container">
<div class="error-header">
<Icon icon=icondata::BiErrorSolid />
<h1>{title}</h1>
</div>
<p>{message}</p>
<p>{error.map(|error| format!("{}", error))}</p>
</div>
}
view! {
<div class="text-red-800">
<div class="grid grid-cols-[max-content_1fr] gap-1">
<Icon icon={icondata::BiErrorSolid} {..} class="self-center" />
<h1 class="self-center">{move || title.get()}</h1>
</div>
<p>{move || message.get()}</p>
<p>{error.map(|error| format!("{error}"))}</p>
</div>
}
}


@@ -1,5 +1,6 @@
use crate::prelude::*;
use http::status::StatusCode;
use leptos::*;
use thiserror::Error;
#[cfg(feature = "ssr")]
@@ -12,7 +13,7 @@ pub enum AppError {
}
impl AppError {
pub fn status_code(&self) -> StatusCode {
pub const fn status_code(&self) -> StatusCode {
match self {
AppError::NotFound => StatusCode::NOT_FOUND,
}
@@ -27,7 +28,7 @@ pub fn ErrorTemplate(
#[prop(optional)] errors: Option<RwSignal<Errors>>,
) -> impl IntoView {
let errors = match outside_errors {
Some(e) => create_rw_signal(e),
Some(e) => RwSignal::new(e),
None => match errors {
Some(e) => e,
None => panic!("No Errors found and we expected errors!"),
@@ -51,7 +52,7 @@ pub fn ErrorTemplate(
response.set_status(errors[0].status_code());
}
}
view! {
<h1>{if errors.len() > 1 {"Errors"} else {"Error"}}</h1>
<For


@@ -0,0 +1,33 @@
use crate::prelude::*;
#[component]
pub fn FancyInput(
#[prop(into)] label: TextProp,
#[prop(optional, into)] password: Signal<bool>,
#[prop(optional)] required: bool,
#[prop(optional)] value: RwSignal<String>,
) -> impl IntoView {
view! {
<div class="relative mt-12 mb-3">
<input
class="peer text-lg w-full relative p-1 z-20 border-none outline-none bg-transparent text-white"
type={move || if password.get() { "password" } else { "text" }}
required={required}
placeholder=""
bind:value={value}
/>
<span
class="absolute left-0 text-lg transition-all duration-500
text-lg peer-[:not(:placeholder-shown)]:text-base peer-focus:text-base
text-black peer-[:not(:placeholder-shown)]:text-neutral-700 peer-focus:text-neutral-700
peer-[:not(:placeholder-shown)]:translate-y-[-30px] peer-focus:translate-y-[-30px]"
>
{label.get()}
</span>
<div
class="w-full h-[2px] rounded-md bg-accent-light absolute bottom-0 left-0
transition-all duration-500 peer-[:not(:placeholder-shown)]:h-10 peer-focus:h-10"
></div>
</div>
}
}


@@ -0,0 +1,26 @@
use crate::prelude::*;
#[component]
pub fn LoadResource<T, S, C, V>(
resource: Resource<BackendResult<T>, S>,
children: C,
) -> impl IntoView
where
T: Send + Sync + Clone + 'static,
S: Send + 'static,
C: Fn(T) -> V + Send + 'static,
V: IntoView + 'static,
{
view! {
<Transition
fallback=move || view! { <Loading /> }
>
{move || resource.get().map(|resource|
match resource {
Ok(resource) => Either::Left(children(resource)),
Err(err) => Either::Right(err.to_component()),
}
)}
</Transition>
}
}


@@ -1,19 +1,26 @@
use leptos::*;
use crate::prelude::*;
/// A loading indicator
#[component]
pub fn Loading() -> impl IntoView {
view! {
<div class="loading"></div>
}
let dots_style = "h-2 w-2 bg-accent rounded-full animate-pulse";
view! {
<div class="flex space-x-1 justify-center items-center my-2">
<span class="sr-only">"Loading..."</span>
<div class=dots_style style="animation-duration: 900ms; animation-delay: 0ms;" />
<div class=dots_style style="animation-duration: 900ms; animation-delay: 300ms"/>
<div class=dots_style style="animation-duration: 900ms; animation-delay: 600ms;" />
</div>
}
}
/// A full page, centered loading indicator
#[component]
pub fn LoadingPage() -> impl IntoView {
view!{
<div class="loading-page">
<Loading />
</div>
}
view! {
<div class="loading-page">
<Loading />
</div>
}
}

src/components/menu.rs Normal file

@@ -0,0 +1,96 @@
use crate::prelude::*;
#[derive(Clone, Copy, PartialEq, Eq)]
pub enum MenuEntry {
Dashboard,
Search,
}
impl MenuEntry {
pub const fn path(&self) -> &'static str {
match self {
MenuEntry::Dashboard => "/",
MenuEntry::Search => "/search",
}
}
pub const fn icon(&self) -> icondata::Icon {
match self {
MenuEntry::Dashboard => icondata::OcHomeFillLg,
MenuEntry::Search => icondata::BiSearchRegular,
}
}
pub const fn title(&self) -> &'static str {
match self {
MenuEntry::Dashboard => "Dashboard",
MenuEntry::Search => "Search",
}
}
pub const fn all() -> [MenuEntry; 2] {
[MenuEntry::Dashboard, MenuEntry::Search]
}
}
#[component]
pub fn MenuItem(entry: MenuEntry, #[prop(into)] active: Signal<bool>) -> impl IntoView {
view! {
<a class="menu-btn" href={entry.path().to_string()}
style={move || if active() {"color: var(--color-menu-active);"} else {""}}
>
<Icon height="1.7rem" width="1.7rem" icon={entry.icon()} {..} class="mr-2" />
<h2>{entry.title()}</h2>
</a>
}
}
#[component]
pub fn Menu(
upload_open: RwSignal<bool>,
add_artist_open: RwSignal<bool>,
add_album_open: RwSignal<bool>,
) -> impl IntoView {
use leptos_router::hooks::use_location;
let location = use_location();
let active_entry = Signal::derive(move || {
let path = location.pathname.get();
MenuEntry::all()
.into_iter()
.find(|entry| entry.path() == path)
});
let dropdown_open = RwSignal::new(false);
view! {
<div class="home-card">
<Show
when=move || {upload_open.get() || add_artist_open.get() || add_album_open.get()}
fallback=move || view! {}
>
<div class="upload-overlay" on:click=move |_| {
upload_open.set(false);
add_artist_open.set(false);
add_album_open.set(false);
}></div>
</Show>
<div class="flex">
<h1 class="text-xl font-bold">"LibreTunes"</h1>
<div class="upload-dropdown-container">
<UploadDropdownBtn dropdown_open=dropdown_open/>
<Show
when= move || dropdown_open()
fallback=move || view! {}
>
<UploadDropdown dropdown_open=dropdown_open upload_open=upload_open add_artist_open=add_artist_open add_album_open=add_album_open/>
</Show>
</div>
</div>
{MenuEntry::all().into_iter().map(|entry| {
let active = Signal::derive(move || active_entry.get() == Some(entry));
view! { <MenuItem entry active /> }
}).collect::<Vec<_>>()}
</div>
}
}

src/components/mod.rs Normal file

@@ -0,0 +1,43 @@
pub mod add_album;
pub mod add_artist;
pub mod artist_list;
pub mod dashboard_row;
pub mod dashboard_tile;
pub mod error;
pub mod error_template;
pub mod fancy_input;
pub mod load_resource;
pub mod loading;
pub mod menu;
pub mod personal;
pub mod playbar;
pub mod queue;
pub mod sidebar;
pub mod song;
pub mod songs;
pub mod upload;
pub mod upload_dropdown;
pub mod all {
use super::*;
pub use add_album::{AddAlbum, AddAlbumBtn};
pub use add_artist::{AddArtist, AddArtistBtn};
pub use artist_list::ArtistList;
pub use dashboard_row::DashboardRow;
pub use dashboard_tile::DashboardTile;
pub use error::{Error, ServerError};
pub use error_template::ErrorTemplate;
pub use fancy_input::FancyInput;
pub use load_resource::LoadResource;
pub use loading::{Loading, LoadingPage};
pub use menu::{Menu, MenuItem};
pub use personal::{DropDownLoggedIn, DropDownNotLoggedIn, Personal, Profile};
pub use playbar::{CustomTitle, PlayBar};
pub use queue::Queue;
pub use sidebar::{Playlists, Sidebar};
pub use song::Song;
pub use songs::all::*;
pub use upload::{Album, Artist, Upload, UploadBtn};
pub use upload_dropdown::{UploadDropdown, UploadDropdownBtn};
}


@@ -1,13 +1,11 @@
use leptos::leptos_dom::*;
use leptos::*;
use leptos_icons::*;
use crate::auth::logout;
use crate::util::state::GlobalState;
use crate::prelude::*;
use leptos::html::Div;
#[component]
pub fn Personal() -> impl IntoView {
view! {
<div class=" personal-container">
<div class="home-card w-[250px] min-w-[250px]">
<Profile />
</div>
}
@@ -15,101 +13,94 @@ pub fn Personal() -> impl IntoView {
#[component]
pub fn Profile() -> impl IntoView {
let (dropdown_open, set_dropdown_open) = create_signal(false);
let user = GlobalState::logged_in_user();
let open_dropdown = move |_| {
set_dropdown_open.update(|value| *value = !*value);
};
let dropdown_open = RwSignal::new(false);
let user = GlobalState::logged_in_user();
let user_profile_picture = move || {
user.get().and_then(|user| {
if let Some(user) = user {
if user.id.is_none() {
return None;
}
Some(format!("/assets/images/profile/{}.webp", user.id.unwrap()))
} else {
None
}
})
};
let toggle_dropdown = move |_| dropdown_open.set(!dropdown_open.get());
let profile_photo = NodeRef::<Div>::new();
let dropdown = NodeRef::<Div>::new();
let _ = on_click_outside_with_options(
dropdown,
move |_| dropdown_open.set(false),
OnClickOutsideOptions::default().ignore(profile_photo),
);
let user_profile_picture = move || user.get().flatten().map(|user| user.image_path.path());
view! {
<div class="profile-container">
<div class="profile-name">
<div class="flex relative">
<div class="text-lg self-center">
<Suspense
fallback=|| view!{
<h1>Not Logged In</h1>
}>
<Show
when=move || user.get().map(|user| user.is_some()).unwrap_or(false)
fallback=|| view!{
<h1>Not Logged In</h1>
}>
<h1>{move || user.get().map(|user| user.map(|user| user.username))}</h1>
</Show>
</Suspense>
</div>
<div class="profile-icon" on:click=open_dropdown>
<Suspense fallback=|| view! { <Icon icon=icondata::CgProfile width="45" height="45"/> }>
<Show
when=move || user.get().map(|user| user.is_some()).unwrap_or(false)
fallback=|| view! { <Icon icon=icondata::CgProfile width="45" height="45"/> }
>
<object class="profile-image" data={user_profile_picture} type="image/webp">
<Icon class="profile-image" icon=icondata::CgProfile width="45" height="45"/>
</object>
</Show>
</Suspense>
</div>
<div class="dropdown-container" style={move || if dropdown_open() {"display: flex"} else {"display: none"}}>
<Suspense
fallback=|| view!{
<DropDownNotLoggedIn />
}>
<Show
when=move || user.get().map(|user| user.is_some()).unwrap_or(false)
fallback=|| view!{
<DropDownNotLoggedIn />
}>
<DropDownLoggedIn />
</Show>
</Suspense>
<div class="self-center hover:scale-105 transition-transform cursor-pointer ml-auto"
on:click=toggle_dropdown node_ref=profile_photo>
<Suspense fallback=|| view! { <Icon icon={icondata::CgProfile} width="45" height="45"/> }>
<Show
when=move || user.get().map(|user| user.is_some()).unwrap_or(false)
fallback=|| view! { <Icon icon={icondata::CgProfile} width="45" height="45"/> }
>
<object class="w-11 h-11 rounded-full pointer-events-none"
data={user_profile_picture} type="image/webp">
<Icon icon={icondata::CgProfile} width="45" height="45" {..} />
</object>
</Show>
</Suspense>
</div>
<Show when=dropdown_open >
<div class="absolute bg-bg-light rounded-lg border-2 border-neutral-700 top-12
right-3 p-1 text-right" node_ref=dropdown>
<Suspense
fallback=|| view!{
<DropDownNotLoggedIn />
}>
<Show
when=move || user.get().map(|user| user.is_some()).unwrap_or(false)
fallback=|| view!{
<DropDownNotLoggedIn />
}>
<DropDownLoggedIn />
</Show>
</Suspense>
</div>
</Show>
</div>
}
}
#[component]
pub fn DropDownNotLoggedIn() -> impl IntoView {
view! {
<div class="dropdown-logged">
<h1>Not Logged In</h1>
<a href="/login"><button class="auth-button">Log In</button></a>
<a href="/signup"><button class="auth-button">Sign Up</button></a>
</div>
<a href="/login"><button class="auth-button">"Log In"</button></a><br/>
<a href="/signup"><button class="auth-button">"Sign Up"</button></a>
}
}
#[component]
pub fn DropDownLoggedIn() -> impl IntoView {
let logout = move |_| {
spawn_local(async move {
let result = logout().await;
let result = api::auth::logout().await;
if let Err(err) = result {
log!("Error logging out: {:?}", err);
leptos_log!("Error logging out: {:?}", err);
} else {
let user = GlobalState::logged_in_user();
user.refetch();
log!("Logged out successfully");
let user = GlobalState::logged_in_user();
user.refetch();
leptos_log!("Logged out successfully");
}
});
};
view! {
<div class="dropdown-logged">
<h1>"Logged In"</h1>
<button on:click=logout class="auth-button">Log Out</button>
</div>
<button on:click=logout class="auth-button">"Log Out"</button>
}
}

src/components/playbar.rs Normal file

@@ -0,0 +1,662 @@
use crate::prelude::*;
use leptos::ev::MouseEvent;
use leptos::html::{Audio, Div};
use leptos_meta::Title;
/// Width and height of the forward/backward skip buttons
const SKIP_BTN_SIZE: &str = "3em";
/// Width and height of the play/pause button
const PLAY_BTN_SIZE: &str = "4em";
/// Width and height of the queue button
const QUEUE_BTN_SIZE: &str = "2.5em";
/// Threshold in seconds for skipping to the previous song instead of skipping to the start of the current song
const MIN_SKIP_BACK_TIME: f64 = 5.0;
/// How many seconds to skip forward/backward when the user presses the arrow keys
const ARROW_KEY_SKIP_TIME: f64 = 5.0;
/// Threshold in seconds for considering when the user has listened to a song, for adding it to the history
const HISTORY_LISTEN_THRESHOLD: u64 = MIN_SKIP_BACK_TIME as u64;
// TODO Handle errors better, when getting audio HTML element and when playing/pausing audio
/// Get the current time and duration of the current song, if available
///
/// Reads the audio element from the global `PlayStatus` state.
///
/// # Returns
///
/// * `None` if the audio element is not available
/// * `Some((current_time, duration))` if the audio element is available
///
pub fn get_song_time_duration() -> Option<(f64, f64)> {
GlobalState::play_status().with_untracked(|status| {
if let Some(audio) = status.get_audio() {
Some((audio.current_time(), audio.duration()))
} else {
leptos_err!("Unable to get current duration: Audio element not available");
None
}
})
}
/// Skip to a certain time in the current song, optionally playing it
///
/// If the given time is +/- infinity or NaN, logs an error and returns
/// Logs an error if the audio element is not available, or if playing the song fails
///
/// # Arguments
///
/// * `time` - The time to skip to, in seconds
///
pub fn skip_to(time: f64) {
if time.is_infinite() || time.is_nan() {
leptos_err!("Unable to skip to non-finite time: {}", time);
return;
}
GlobalState::play_status().update(|status| {
if let Some(audio) = status.get_audio() {
audio.set_current_time(time);
status.playing = true;
leptos_log!("Player skipped to time: {}", time);
} else {
leptos_err!("Unable to skip to time: Audio element not available");
}
});
}
fn toggle_queue() {
GlobalState::play_status().update(|status| {
status.queue_open = !status.queue_open;
});
}
/// The play, pause, and skip buttons
#[component]
fn PlayControls() -> impl IntoView {
let status = GlobalState::play_status();
// On click handlers for the skip and play/pause buttons
let skip_back = move |_| {
if let Some(duration) = get_song_time_duration() {
// Skip to previous song if the current song is near the start
// Also skip to the previous song if we're at the end of the current song
// This is because after running out of songs in the queue, the current song will be at the end
// but our queue will be empty. We *do* want to "skip the start of the current song",
// but first we need to get the *previous* song from the history since that's what we were playing before
if duration.0 < MIN_SKIP_BACK_TIME || duration.0 >= duration.1 {
leptos_log!("Skipping to the previous song");
// Pop the most recently played song from the history if possible
let mut last_played_song = None;
status.update(|status| last_played_song = status.history.pop_back());
if let Some(last_played_song) = last_played_song {
// Push the popped song to the front of the queue, and play it
status.update(|status| status.queue.push_front(last_played_song));
} else {
leptos_warn!("Unable to skip back: No previous song");
}
}
}
// Default to skipping to start of current song, and playing
leptos_log!("Skipping to start of current song");
skip_to(0.0);
};
let skip_forward = move |_| {
if let Some(duration) = get_song_time_duration() {
skip_to(duration.1);
} else {
leptos_err!("Unable to skip forward: Unable to get current duration");
}
};
let toggle_play = move |_| {
status.update(|status| status.playing = !status.playing);
};
// Change the icon based on whether the song is playing or not
let icon = Signal::derive(move || {
status.with(|status| {
if status.playing {
icondata::BsPauseFill
} else {
icondata::BsPlayFill
}
})
});
view! {
<div class="flex place-content-center">
<button class="control" on:click=skip_back>
<Icon width=SKIP_BTN_SIZE height=SKIP_BTN_SIZE icon={icondata::BsSkipStartFill} />
</button>
<button class="control" on:click=toggle_play>
<Icon width=PLAY_BTN_SIZE height=PLAY_BTN_SIZE icon={icon} />
</button>
<button class="control" on:click=skip_forward>
<Icon width=SKIP_BTN_SIZE height=SKIP_BTN_SIZE icon={icondata::BsSkipEndFill} />
</button>
</div>
}
}
/// The elapsed time and total time of the current song
#[component]
fn PlayDuration(elapsed_secs: Signal<i64>, total_secs: Signal<i64>) -> impl IntoView {
// Create a derived signal that formats the elapsed and total seconds into a string
let play_duration = Signal::derive(move || {
let elapsed_mins = (elapsed_secs.get() - elapsed_secs.get() % 60) / 60;
let total_mins = (total_secs.get() - total_secs.get() % 60) / 60;
let elapsed_secs = elapsed_secs.get() % 60;
let total_secs = total_secs.get() % 60;
// Format as "MM:SS / MM:SS"
format!("{elapsed_mins}:{elapsed_secs:0>2} / {total_mins}:{total_secs:0>2}")
});
view! {
<div class="text-controls p-1">
{play_duration}
</div>
}
}
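The "MM:SS / MM:SS" arithmetic in the derived signal above can be exercised in isolation; `mmss` is a hypothetical helper mirroring that formatting, not part of the component:

```rust
// Format a second count as M:SS, matching the integer math in PlayDuration:
// whole minutes via (secs - secs % 60) / 60, remainder zero-padded to 2 digits.
fn mmss(secs: i64) -> String {
    let mins = (secs - secs % 60) / 60;
    format!("{}:{:0>2}", mins, secs % 60)
}

fn main() {
    assert_eq!(mmss(75), "1:15");
    assert_eq!(mmss(600), "10:00");
    assert_eq!(mmss(7), "0:07");
}
```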
/// The name, artist, and album of the current song
#[component]
fn MediaInfo() -> impl IntoView {
let status = GlobalState::play_status();
let name = Signal::derive(move || {
status.with(|status| {
status
.queue
.front()
.map_or("No media playing".into(), |song| song.title.clone())
})
});
let artist = Signal::derive(move || {
status.with(|status| {
status.queue.front().map_or("".into(), |song| {
backend::Artist::display_list(&song.artists).to_string()
})
})
});
let album = Signal::derive(move || {
status.with(|status| {
status.queue.front().map_or("".into(), |song| {
song.album
.as_ref()
.map_or("".into(), |album| album.title.clone())
})
})
});
let image = Signal::derive(move || {
status.with(|status| {
status
.queue
.front()
.map_or(MUSIC_PLACEHOLDER_WEB_PATH.to_string(), |song| {
song.image_path.clone().path()
})
})
});
view! {
<img class="w-[60px] p-1" src={image}/>
<div class="text-controls p-1">
{name}
<br/>
{artist} - {album}
</div>
}
}
/// The like and dislike buttons
#[component]
fn LikeDislike() -> impl IntoView {
let status = GlobalState::play_status();
let like_icon = Signal::derive(move || {
status.with(|status| match status.queue.front() {
Some(frontend::Song {
like_dislike: Some((true, _)),
..
}) => icondata::TbThumbUpFilled,
_ => icondata::TbThumbUp,
})
});
let dislike_icon = Signal::derive(move || {
status.with(|status| match status.queue.front() {
Some(frontend::Song {
like_dislike: Some((_, true)),
..
}) => icondata::TbThumbDownFilled,
_ => icondata::TbThumbDown,
})
});
let toggle_like = move |_| {
status.update(|status| {
match status.queue.front_mut() {
Some(frontend::Song {
id,
like_dislike: Some((liked, disliked)),
..
}) => {
*liked = !*liked;
if *liked {
*disliked = false;
}
let id = *id;
let liked = *liked;
spawn_local(async move {
if let Err(e) = api::songs::set_like_song(id, liked).await {
leptos_err!("Error liking song: {:?}", e);
}
});
}
Some(frontend::Song {
id, like_dislike, ..
}) => {
// This arm should only be reached if like_dislike is None
// In this case, the buttons will show up not filled, indicating that the song is not
// liked or disliked. Therefore, clicking the like button should like the song.
*like_dislike = Some((true, false));
let id = *id;
spawn_local(async move {
if let Err(e) = api::songs::set_like_song(id, true).await {
leptos_err!("Error liking song: {:?}", e);
}
});
}
_ => {
leptos_log!("Unable to like song: No song in queue");
}
}
});
};
let toggle_dislike = move |_| {
status.update(|status| {
match status.queue.front_mut() {
Some(frontend::Song {
id,
like_dislike: Some((liked, disliked)),
..
}) => {
*disliked = !*disliked;
if *disliked {
*liked = false;
}
let id = *id;
let disliked = *disliked;
spawn_local(async move {
if let Err(e) = api::songs::set_dislike_song(id, disliked).await {
leptos_err!("Error disliking song: {:?}", e);
}
});
}
Some(frontend::Song {
id, like_dislike, ..
}) => {
// This arm should only be reached if like_dislike is None
// In this case, the buttons will show up not filled, indicating that the song is not
// liked or disliked. Therefore, clicking the dislike button should dislike the song.
*like_dislike = Some((false, true));
let id = *id;
spawn_local(async move {
if let Err(e) = api::songs::set_dislike_song(id, true).await {
leptos_err!("Error disliking song: {:?}", e);
}
});
}
_ => {
leptos_log!("Unable to dislike song: No song in queue");
}
}
});
};
view! {
<div class="flex">
<button class="control scale-x-[-1] p-1" on:click=toggle_dislike>
<Icon width=SKIP_BTN_SIZE height=SKIP_BTN_SIZE icon={dislike_icon} />
</button>
<button class="control p-1" on:click=toggle_like>
<Icon width=SKIP_BTN_SIZE height=SKIP_BTN_SIZE icon={like_icon} />
</button>
</div>
}
}
/// The play progress bar, and click handler for skipping to a certain time in the song
#[component]
fn ProgressBar(percentage: Signal<f64>) -> impl IntoView {
// Keep a reference to the progress bar div so we can get its width and calculate the time to skip to
let progress_bar_ref = NodeRef::<Div>::new();
let progress_jump = move |e: MouseEvent| {
let x_click_pos = e.offset_x() as f64;
leptos_log!("Progress bar clicked at x: {}", x_click_pos);
if let Some(progress_bar) = progress_bar_ref.get() {
let width = progress_bar.offset_width() as f64;
let percentage = x_click_pos / width * 100.0;
if let Some(duration) = get_song_time_duration() {
let time = duration.1 * percentage / 100.0;
skip_to(time);
} else {
leptos_err!("Unable to skip to time: Unable to get current duration");
}
} else {
leptos_err!("Unable to skip to time: Progress bar not available");
}
};
// Create a derived signal that formats the song percentage into a CSS style string for width
let bar_width_style = Signal::derive(move || format!("width: {}%;", percentage.get()));
view! {
<div class="w-full h-[14px] translate-y-[50%] pt-[7px] cursor-pointer" node_ref=progress_bar_ref on:click=progress_jump> // Larger click area
<div class="bg-controls-active h-[3px]"> // "Unfilled" progress bar
<div class="from-play-grad-start to-play-grad-end bg-linear-90 h-[3px]"
style=bar_width_style /> // "Filled" progress bar
<div class="from-play-grad-start to-play-grad-end bg-linear-90 h-[3px]
translate-y-[-3px] blur-[3px]" style=bar_width_style /> // "Filled" progress bar blur
</div>
</div>
}
}
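The click-to-seek arithmetic in `ProgressBar` reduces to a pure function: click x over bar width gives a percentage, which maps back onto the song duration. `seek_time` below is a hypothetical name for illustration:

```rust
// Map a click position on the progress bar to a time within the song,
// following the same percentage math as the progress_jump handler.
fn seek_time(x_click: f64, bar_width: f64, duration_secs: f64) -> f64 {
    let percentage = x_click / bar_width * 100.0;
    duration_secs * percentage / 100.0
}

fn main() {
    // Clicking halfway along a 400px bar on a 200s song seeks to 100s.
    assert_eq!(seek_time(200.0, 400.0, 200.0), 100.0);
    // Clicking the left edge seeks to the start.
    assert_eq!(seek_time(0.0, 400.0, 200.0), 0.0);
}
```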
#[component]
fn QueueToggle() -> impl IntoView {
let update_queue = move |_| {
toggle_queue();
leptos_log!(
"queue button pressed, queue status: {:?}",
GlobalState::play_status().with_untracked(|status| status.queue_open)
);
};
view! {
<button id="queue-toggle-btn" class="control p-1" on:click=update_queue>
<Icon width=QUEUE_BTN_SIZE height=QUEUE_BTN_SIZE icon={icondata::RiPlayListMediaFill} />
</button>
}
}
/// Renders the title of the page based on the currently playing song
#[component]
pub fn CustomTitle() -> impl IntoView {
let title = Memo::new(move |_| {
GlobalState::play_status().with(|play_status| {
play_status
.queue
.front()
.map_or("LibreTunes".to_string(), |song_data| {
format!(
"{} - {} | {}",
song_data.title.clone(),
backend::Artist::display_list(&song_data.artists),
"LibreTunes"
)
})
})
});
view! {
<Title text=title />
}
}
/// The main play bar component, containing the progress bar, media info, play controls, and play duration
#[component]
pub fn PlayBar() -> impl IntoView {
use web_sys::wasm_bindgen::JsCast;
let status = GlobalState::play_status();
// Listen for key down events -- arrow keys don't seem to trigger key press events
let _arrow_key_handle =
window_event_listener(leptos::ev::keydown, move |e: leptos::ev::KeyboardEvent| {
// Skip if the event target is an input element
if let Some(true) = e
.target()
.map(|t| t.has_type::<web_sys::HtmlInputElement>())
{
return;
}
if e.key() == "ArrowRight" {
e.prevent_default();
leptos_log!(
"Right arrow key pressed, skipping forward by {} seconds",
ARROW_KEY_SKIP_TIME
);
if let Some(duration) = get_song_time_duration() {
let mut time = duration.0 + ARROW_KEY_SKIP_TIME;
time = time.clamp(0.0, duration.1);
skip_to(time);
} else {
leptos_err!("Unable to skip forward: Unable to get current duration");
}
} else if e.key() == "ArrowLeft" {
e.prevent_default();
leptos_log!(
"Left arrow key pressed, skipping backward by {} seconds",
ARROW_KEY_SKIP_TIME
);
if let Some(duration) = get_song_time_duration() {
let mut time = duration.0 - ARROW_KEY_SKIP_TIME;
time = time.clamp(0.0, duration.1);
skip_to(time);
} else {
leptos_err!("Unable to skip backward: Unable to get current duration");
}
}
});
// Listen for space bar presses to play/pause
let _space_bar_handle =
window_event_listener(leptos::ev::keypress, move |e: leptos::ev::KeyboardEvent| {
// Skip if the event target is an input element
if let Some(true) = e
.target()
.map(|t| t.has_type::<web_sys::HtmlInputElement>())
{
return;
}
if e.key() == " " {
e.prevent_default();
leptos_log!("Space bar pressed, toggling play/pause");
status.update(|status| status.playing = !status.playing);
}
});
// Keep a reference to the audio element so we can set its source and play/pause it
let audio_ref = NodeRef::<Audio>::new();
status.update(|status| status.audio_player = Some(audio_ref));
Effect::new(move |_| {
status.with(|status| {
if let Some(audio) = status.get_audio() {
if status.playing {
if let Err(e) = audio.play() {
leptos_log!("Unable to play audio: {:?}", e);
}
} else if let Err(e) = audio.pause() {
leptos_log!("Unable to pause audio: {:?}", e);
}
} else {
leptos_err!("Unable to play/pause audio: Audio element not available");
}
});
});
// Create signals for song time and progress
let (elapsed_secs, set_elapsed_secs) = signal(0);
let (total_secs, set_total_secs) = signal(0);
let (percentage, set_percentage) = signal(0.0);
let current_song_id =
Memo::new(move |_| status.with(|status| status.queue.front().map(|song| song.id)));
let current_song_src = Memo::new(move |_| {
status.with(|status| status.queue.front().map(|song| song.song_path.clone()))
});
Effect::new(move |_| {
current_song_src.with(|src| {
GlobalState::play_status().with_untracked(|status| {
if let Some(audio) = status.get_audio() {
if let Some(src) = src {
audio.set_src(&src.clone().path());
if let Err(e) = audio.play() {
leptos_err!("Error playing audio: {:?}", e);
} else {
leptos_log!("Audio playing");
}
} else {
audio.set_src("");
}
} else {
leptos_err!("Unable to set audio source: Audio element not available");
}
});
});
});
// Track the last song that was added to the history to prevent duplicates
let last_history_song_id = RwSignal::new(None);
let leptos_use::utils::Pausable {
is_active: hist_timeout_pending,
resume: resume_hist_timeout,
pause: pause_hist_timeout,
..
} = use_interval_fn(
move || {
if last_history_song_id.get_untracked() == current_song_id.get_untracked() {
return;
}
if let Some(current_song_id) = current_song_id.get_untracked() {
last_history_song_id.set(Some(current_song_id));
spawn_local(async move {
if let Err(e) = crate::api::history::add_history(current_song_id).await {
leptos_err!("Error adding song {} to history: {}", current_song_id, e);
}
});
}
},
HISTORY_LISTEN_THRESHOLD * 1000,
);
// Initially pause the timeout, since the audio starts off paused
pause_hist_timeout();
let on_play = move |_| {
leptos_log!("Audio playing");
status.update(|status| status.playing = true);
};
let on_pause = move |_| {
leptos_log!("Audio paused");
status.update(|status| status.playing = false);
pause_hist_timeout();
};
let on_time_update = move |_| {
status.with_untracked(|status| {
if let Some(audio) = status.get_audio() {
set_elapsed_secs(audio.current_time() as i64);
set_total_secs(audio.duration() as i64);
if elapsed_secs.get_untracked() > 0 {
set_percentage(
elapsed_secs.get_untracked() as f64 / total_secs.get_untracked() as f64
* 100.0,
);
} else {
set_percentage(0.0);
}
} else {
leptos_err!("Unable to update time: Audio element not available");
}
});
// If time is updated, audio is playing, so make sure the history timeout is running
if !hist_timeout_pending.get_untracked() {
resume_hist_timeout();
}
};
let on_end = move |_| {
leptos_log!("Song ended");
// Move the now-finished song to the history
// TODO Somehow make sure next song starts playing before repeatedly jumping to next
status.update(|status| {
let prev_song = status.queue.pop_front();
if let Some(prev_song) = prev_song {
leptos_log!("Adding song to history: {}", prev_song.title);
status.history.push_back(prev_song);
} else {
leptos_log!("Queue empty, no previous song to add to history");
}
});
};
view! {
<audio node_ref=audio_ref on:play=on_play on:pause=on_pause
on:timeupdate=on_time_update on:ended=on_end />
<div class="fixed bottom-0 w-full">
<ProgressBar percentage=percentage.into() />
<div class="flex items-center w-full bg-bg-light">
<div class="flex-1 flex">
<MediaInfo />
<LikeDislike />
</div>
<div class="flex-1">
<PlayControls />
</div>
<div class="flex-1 flex flex-col items-end">
<PlayDuration elapsed_secs=elapsed_secs.into() total_secs=total_secs.into() />
<QueueToggle />
</div>
</div>
</div>
}
}

src/components/queue.rs Normal file

@@ -0,0 +1,140 @@
use crate::prelude::*;
use leptos::ev::{DragEvent, MouseEvent};
use leptos::html::Div;
const RM_BTN_SIZE: &str = "2.5rem";
fn remove_song_fn(index: usize) {
if index == 0 {
leptos_log!("Error: Trying to remove currently playing song (index 0) from queue");
} else {
leptos_log!(
"Remove Song from Queue: Song is not currently playing, deleting song from queue and not adding to history"
);
GlobalState::play_status().update(|status| {
status.queue.remove(index);
});
}
}
#[component]
pub fn Queue() -> impl IntoView {
let status = GlobalState::play_status();
let remove_song = move |index: usize| {
remove_song_fn(index);
leptos_log!("Removed song {}", index + 1);
};
let prevent_focus = move |e: MouseEvent| {
e.prevent_default();
};
let index_being_dragged = RwSignal::new(-1);
let index_being_hovered = RwSignal::new(-1);
let on_drag_start = move |_e: DragEvent, index: usize| {
// set the index of the item being dragged
index_being_dragged.set(index as i32);
};
let on_drop = move |e: DragEvent| {
e.prevent_default();
// if the index of the item being dragged is not the same as the index of the item being hovered over
if index_being_dragged.get() != index_being_hovered.get()
&& index_being_dragged.get() > 0
&& index_being_hovered.get() > 0
{
// get the index of the item being dragged
let dragged_index = index_being_dragged.get_untracked() as usize;
// get the index of the item being hovered over
let hovered_index = index_being_hovered.get_untracked() as usize;
// update the queue
status.update(|status| {
// remove the dragged item from the list
let dragged_item = status.queue.remove(dragged_index);
// insert the dragged item at the index of the item being hovered over
status.queue.insert(hovered_index, dragged_item.unwrap());
});
// reset the index of the item being dragged
index_being_dragged.set(-1);
// reset the index of the item being hovered over
index_being_hovered.set(-1);
leptos_log!(
"drag end. Moved item from index {} to index {}",
dragged_index,
hovered_index
);
} else {
// reset the index of the item being dragged
index_being_dragged.set(-1);
// reset the index of the item being hovered over
index_being_hovered.set(-1);
}
};
let on_drag_enter = move |_e: DragEvent, index: usize| {
// set the index of the item being hovered over
index_being_hovered.set(index as i32);
};
let on_drag_over = move |e: DragEvent| {
e.prevent_default();
};
let queue = NodeRef::<Div>::new();
let _ = on_click_outside_with_options(
queue,
move |_| {
status.update(|status| {
status.queue_open = false;
});
},
OnClickOutsideOptions::default().ignore(["#queue-toggle-btn"]),
);
view! {
<Show
when=move || status.with(|status| status.queue_open)
fallback=|| view!{""}>
<div class="queue" node_ref=queue>
<div class="queue-header">
<h2>Queue</h2>
</div>
<ul>
{
move || status.with(|status| status.queue.iter()
.enumerate()
.map(|(index, song)| view! {
<div class="queue-item"
draggable="true"
on:dragstart=move |e: DragEvent| on_drag_start(e, index)
on:drop=on_drop
on:dragenter=move |e: DragEvent| on_drag_enter(e, index)
on:dragover=on_drag_over
>
<Song
song_image_path=song.image_path.clone().path()
song_title=song.title.clone()
song_artist=backend::Artist::display_list(&song.artists) />
<Show
when=move || index != 0
fallback=|| view!{
<p>Playing</p>
}>
<button on:click=move |_| remove_song(index) on:mousedown=prevent_focus>
<Icon width=RM_BTN_SIZE height=RM_BTN_SIZE icon={icondata::CgTrash} {..} class="remove-song" />
</button>
</Show>
</div>
})
.collect::<Vec<_>>())
}
</ul>
</div>
</Show>
}
}


@@ -1,10 +0,0 @@
use leptos::*;
#[component]
pub fn Search() -> impl IntoView {
view! {
<div class="search-container home-component">
<h1>Searching...</h1>
</div>
}
}


@@ -1,54 +1,154 @@
use leptos::leptos_dom::*;
use leptos::*;
use leptos_icons::*;
use crate::components::upload::*;
use crate::prelude::*;
use leptos::html::Div;
use leptos_router::components::{A, Form};
use leptos_router::hooks::use_location;
use std::sync::Arc;
use web_sys::Response;
#[component]
pub fn Sidebar(
upload_open: RwSignal<bool>,
add_artist_open: RwSignal<bool>,
add_album_open: RwSignal<bool>,
) -> impl IntoView {
view! {
<div class="flex flex-col w-[250px] min-w-[250px]">
<Menu upload_open add_artist_open add_album_open />
<Playlists />
</div>
}
}
#[component]
fn AddPlaylistDialog(open: RwSignal<bool>, node_ref: NodeRef<Div>) -> impl IntoView {
let playlist_name = RwSignal::new("".to_string());
let loading = RwSignal::new(false);
let error_msg = RwSignal::new(None);
let handle_response = Arc::new(move |response: &Response| {
loading.set(false);
if response.ok() {
open.set(false);
GlobalState::playlists().refetch();
} else {
error_msg.set(Some("Failed to create playlist".to_string()));
}
});
view! {
<dialog class="fixed top-0 left-0 w-full h-full bg-black/50 flex items-center justify-center" class:open=open>
<div node_ref=node_ref class="bg-neutral-800 rounded-lg p-4 w-1/3 text-white">
<div class="flex items-center pb-3">
<h1 class="text-2xl">"Create Playlist"</h1>
<button id="add-playlist-dialog-btn" class="control ml-auto" on:click=move |_| open.set(false)>
<Icon icon={icondata::IoClose} {..} class="w-7 h-7" />
</button>
</div>
<Form action="/api/playlists/create" on_response=handle_response.clone()
method="POST" enctype="multipart/form-data".to_string()>
<div class="grid grid-cols-[auto_1fr] gap-4">
<label for="new-playlist-name">"Playlist Name"</label>
<input id="new-playlist-name" name="name"
class="bg-neutral-800 text-neutral-200 border border-neutral-600 rounded-lg p-2 outline-none"
type="text" placeholder="My Playlist" bind:value=playlist_name required autocomplete="off" />
<label for="new-playlist-img">"Cover Image"</label>
<input id="new-playlist-img" name="picture" type="file" accept="image/*" />
</div>
{move || {
error_msg.get().map(|error| {
view! {
<Error<String>
message=error.clone()
/>
}
})
}}
<div class="flex justify-end">
<button type="submit" class="control-solid" on:click=move |_| {
error_msg.set(None);
loading.set(true);
}>
"Create"
</button>
</div>
</Form>
</div>
</dialog>
}
}
#[component]
pub fn Playlists() -> impl IntoView {
let location = use_location();
let on_dashboard = Signal::derive(
move || location.pathname.get().starts_with("/dashboard") || location.pathname.get() == "/",
);
let liked_songs_active = Signal::derive(move || location.pathname.get().ends_with("/liked"));
let add_playlist_open = RwSignal::new(false);
let create_playlist = move |_| {
leptos::logging::log!("Creating playlist");
add_playlist_open.set(true);
};
let add_playlist_dialog = NodeRef::<Div>::new();
let _dialog_close_handler = on_click_outside_with_options(
add_playlist_dialog,
move |_| add_playlist_open.set(false),
OnClickOutsideOptions::default().ignore(["#add-playlist-dialog-btn"]),
);
view! {
<div class="home-card">
<div class="flex items-center mb-2">
<h1 class="p-2 text-xl">"Playlists"</h1>
<button class="control-solid ml-auto" on:click=create_playlist>
<Icon icon={icondata::AiPlusOutlined} {..} class="w-4 h-4" />
</button>
</div>
<div>
<A href={"/liked".to_string()} {..}
style={move || if liked_songs_active() {"background-color: var(--color-neutral-700);"} else {""}}
class="flex items-center hover:bg-neutral-700 rounded-md my-1"
>
<img class="w-15 h-15 rounded-xl p-2"
src=MUSIC_PLACEHOLDER_WEB_PATH />
<h2 class="pr-3 my-2">"Liked Songs"</h2>
</A>
<LoadResource
resource={GlobalState::playlists()}
let:playlists
>
{playlists.into_iter().map(|playlist| {
let active = Signal::derive(move || {
location.pathname.get().ends_with(&format!("/playlist/{}", playlist.id))
});
view! {
<A href={format!("/playlist/{}", playlist.id)} {..}
style={move || if active() {"background-color: var(--color-neutral-700);"} else {""}}
class="flex items-center hover:bg-neutral-700 rounded-md my-1" >
<img class="w-15 h-15 rounded-xl p-2 object-cover"
src=playlist.image_path.path() />
<h2 class="pr-3 my-2">{playlist.name}</h2>
</A>
}
}).collect::<Vec<_>>()}
</LoadResource>
</div>
</div>
<Show
when=add_playlist_open
fallback=move || view! {}
>
<AddPlaylistDialog node_ref=add_playlist_dialog open=add_playlist_open />
</Show>
}
}

src/components/song.rs Normal file

@@ -0,0 +1,14 @@
use crate::prelude::*;
#[component]
pub fn Song(song_image_path: String, song_title: String, song_artist: String) -> impl IntoView {
view! {
<div class="queue-song">
<img src={song_image_path} alt={song_title.clone()} />
<div class="queue-song-info">
<h3>{song_title}</h3>
<p>{song_artist}</p>
</div>
</div>
}
}


@@ -1,270 +0,0 @@
use std::rc::Rc;
use leptos::*;
use leptos::logging::*;
use leptos_icons::*;
use crate::api::songs::*;
use crate::songdata::SongData;
use crate::models::{Album, Artist};
use crate::util::state::GlobalState;
const LIKE_DISLIKE_BTN_SIZE: &str = "2em";
#[component]
pub fn SongList(songs: Vec<SongData>) -> impl IntoView {
__SongListInner(songs.into_iter().map(|song| (song, ())).collect::<Vec<_>>(), false)
}
#[component]
pub fn SongListExtra<T>(songs: Vec<(SongData, T)>) -> impl IntoView where
T: Clone + IntoView + 'static
{
__SongListInner(songs, true)
}
#[component]
fn SongListInner<T>(songs: Vec<(SongData, T)>, show_extra: bool) -> impl IntoView where
T: Clone + IntoView + 'static
{
let songs = Rc::new(songs);
let songs_2 = songs.clone();
// Signal that acts as a callback for a song list item to queue songs after it in the list
let (handle_queue_remaining, do_queue_remaining) = create_signal(None);
create_effect(move |_| {
let clicked_index = handle_queue_remaining.get();
if let Some(index) = clicked_index {
GlobalState::play_status().update(|status| {
let song: &(SongData, T) = songs.get(index).expect("Invalid song list item index");
if status.queue.front().map(|song| song.id) == Some(song.0.id) {
// If the clicked song is already at the front of the queue, just play it
status.playing = true;
} else {
// Otherwise, add the currently playing song to the history,
// clear the queue, and queue the clicked song and the others after it
if let Some(last_playing) = status.queue.pop_front() {
status.history.push_back(last_playing);
}
status.queue.clear();
status.queue.extend(songs.iter().skip(index).map(|(song, _)| song.clone()));
status.playing = true;
}
});
}
});
view! {
<table class="song-list">
{
songs_2.iter().enumerate().map(|(list_index, (song, extra))| {
let song_id = song.id;
let playing = create_rw_signal(false);
create_effect(move |_| {
GlobalState::play_status().with(|status| {
playing.set(status.queue.front().map(|song| song.id) == Some(song_id) && status.playing);
});
});
view! {
<SongListItem song={song.clone()} song_playing=playing.into()
extra={if show_extra { Some(extra.clone()) } else { None }} list_index do_queue_remaining/>
}
}).collect::<Vec<_>>()
}
</table>
}
}
#[component]
pub fn SongListItem<T>(song: SongData, song_playing: MaybeSignal<bool>, extra: Option<T>,
list_index: usize, do_queue_remaining: WriteSignal<Option<usize>>) -> impl IntoView where
T: IntoView + 'static
{
let liked = create_rw_signal(song.like_dislike.map(|(liked, _)| liked).unwrap_or(false));
let disliked = create_rw_signal(song.like_dislike.map(|(_, disliked)| disliked).unwrap_or(false));
view! {
<tr class="song-list-item">
<td class="song-image"><SongImage image_path=song.image_path song_playing
list_index do_queue_remaining /></td>
<td class="song-title"><p>{song.title}</p></td>
<td class="song-list-spacer"></td>
<td class="song-artists"><SongArtists artists=song.artists /></td>
<td class="song-list-spacer"></td>
<td class="song-album"><SongAlbum album=song.album /></td>
<td class="song-list-spacer-big"></td>
<td class="song-like-dislike"><SongLikeDislike song_id=song.id liked disliked/></td>
<td>{format!("{}:{:02}", song.duration / 60, song.duration % 60)}</td>
{extra.map(|extra| view! {
<td class="song-list-spacer"></td>
<td>{extra}</td>
})}
</tr>
}
}
/// Display the song's image, with an overlay if the song is playing
/// When the song list item is hovered, the overlay will show the play button
#[component]
fn SongImage(image_path: String, song_playing: MaybeSignal<bool>, list_index: usize,
do_queue_remaining: WriteSignal<Option<usize>>) -> impl IntoView
{
let play_song = move |_| {
do_queue_remaining.set(Some(list_index));
};
let pause_song = move |_| {
GlobalState::play_status().update(|status| {
status.playing = false;
});
};
view! {
<img class="song-image" src={image_path}/>
{move || if song_playing.get() {
view! { <Icon class="song-image-overlay song-playing-overlay"
icon=icondata::BsPauseFill on:click=pause_song /> }.into_view()
} else {
view! { <Icon class="song-image-overlay hide-until-hover"
icon=icondata::BsPlayFill on:click=play_song /> }.into_view()
}}
}
}
/// Displays a song's artists, with links to their artist pages
#[component]
fn SongArtists(artists: Vec<Artist>) -> impl IntoView {
let num_artists = artists.len() as isize;
artists.iter().enumerate().map(|(i, artist)| {
let i = i as isize;
view! {
{
if let Some(id) = artist.id {
view! { <a href={format!("/artist/{}", id)}>{artist.name.clone()}</a> }.into_view()
} else {
view! { <span>{artist.name.clone()}</span> }.into_view()
}
}
{if i < num_artists - 2 { ", " } else if i == num_artists - 2 { " & " } else { "" }}
}
}).collect::<Vec<_>>()
}
/// Display a song's album, with a link to the album page
#[component]
fn SongAlbum(album: Option<Album>) -> impl IntoView {
album.as_ref().map(|album| {
view! {
<span>
{
if let Some(id) = album.id {
view! { <a href={format!("/album/{}", id)}>{album.title.clone()}</a> }.into_view()
} else {
view! { <span>{album.title.clone()}</span> }.into_view()
}
}
</span>
}
})
}
/// Display like and dislike buttons for a song, and indicate if the song is liked or disliked
#[component]
fn SongLikeDislike(
#[prop(into)]
song_id: MaybeSignal<i32>,
liked: RwSignal<bool>,
disliked: RwSignal<bool>) -> impl IntoView
{
let like_icon = Signal::derive(move || {
if liked.get() {
icondata::TbThumbUpFilled
} else {
icondata::TbThumbUp
}
});
let dislike_icon = Signal::derive(move || {
if disliked.get() {
icondata::TbThumbDownFilled
} else {
icondata::TbThumbDown
}
});
let like_class = MaybeProp::derive(move || {
if liked.get() {
Some(TextProp::from("controlbtn"))
} else {
Some(TextProp::from("controlbtn hide-until-hover"))
}
});
let dislike_class = MaybeProp::derive(move || {
if disliked.get() {
Some(TextProp::from("controlbtn hmirror"))
} else {
Some(TextProp::from("controlbtn hmirror hide-until-hover"))
}
});
// If an error occurs, check the like/dislike status again to ensure consistency
let check_like_dislike = move || {
spawn_local(async move {
match get_like_dislike_song(song_id.get_untracked()).await {
Ok((like, dislike)) => {
liked.set(like);
disliked.set(dislike);
},
Err(_) => {}
}
});
};
let toggle_like = move |_| {
let new_liked = !liked.get_untracked();
liked.set(new_liked);
disliked.set(disliked.get_untracked() && !liked.get_untracked());
spawn_local(async move {
match set_like_song(song_id.get_untracked(), new_liked).await {
Ok(_) => {},
Err(e) => {
error!("Error setting like: {}", e);
check_like_dislike();
}
}
});
};
let toggle_dislike = move |_| {
disliked.set(!disliked.get_untracked());
liked.set(liked.get_untracked() && !disliked.get_untracked());
spawn_local(async move {
match set_dislike_song(song_id.get_untracked(), disliked.get_untracked()).await {
Ok(_) => {},
Err(e) => {
error!("Error setting dislike: {}", e);
check_like_dislike();
}
}
});
};
view! {
<button on:click=toggle_dislike>
<Icon class=dislike_class width=LIKE_DISLIKE_BTN_SIZE height=LIKE_DISLIKE_BTN_SIZE icon=dislike_icon />
</button>
<button on:click=toggle_like>
<Icon class=like_class width=LIKE_DISLIKE_BTN_SIZE height=LIKE_DISLIKE_BTN_SIZE icon=like_icon />
</button>
}
}


@@ -0,0 +1,434 @@
use crate::prelude::*;
use leptos::html::Div;
use std::ops::Range;
/// Render this many times the number of songs that are visible
const SONG_RENDER_MULTIPLIER: f64 = 5.;
/// Render this many songs in a group
const SONG_GROUP_SIZE: usize = 30;
/// Type for fetching Songs
type SongListResource = ArcResource<BackendResult<Vec<Option<frontend::Song>>>>;
/// Container to store a group of songs being fetched and rendered
#[derive(Clone)]
pub struct SongGroup {
resource: SongListResource,
song_ids: Vec<i32>,
list_range: Range<usize>,
}
// Implementation of PartialEq to be used in a memo
// Assume that two `SongGroup` fetching the same song ids are equivalent
impl PartialEq for SongGroup {
fn eq(&self, other: &Self) -> bool {
self.song_ids == other.song_ids
}
}
/// Container to store a group of songs being fetched and rendered, with extra data
#[derive(Clone)]
pub struct SongGroupExt<T> {
resource: SongListResource,
extra: Vec<T>,
song_ids: Vec<i32>,
list_range: Range<usize>,
}
// Implementation of PartialEq to be used in a memo
// Assume that two `SongGroupExt` fetching the same song ids are equivalent
impl<T> PartialEq for SongGroupExt<T> {
fn eq(&self, other: &Self) -> bool {
self.song_ids == other.song_ids
}
}
/// Calculate the range of song list indexes to render, and the size of the spacer above the first
/// song group
fn get_render_range(
songlist_elem: NodeRef<Div>,
song_count: Signal<usize>,
) -> Memo<(Range<usize>, f64)> {
use std::cmp::max;
// Measure the top of the song list component and top of the playbar to find the visible range
let UseElementBoundingReturn {
top: songlist_top, ..
} = use_element_bounding(songlist_elem);
let UseElementBoundingReturn {
top: playbar_top, ..
} = use_element_bounding(GlobalState::playbar_element());
Memo::new(move |_| {
let song_count = song_count.get();
let songlist_top = songlist_top.get();
let playbar_top = playbar_top.get();
// Index of the topmost visible song in the song list
let topmost_visible_song_index = max((-songlist_top / SONG_ROW_HEIGHT) as usize, 0);
// The number of currently visible songs displayed
// NOTE: This doesn't include anything below the SongList component on the screen.
// E.g. If the page is scrolled all the way down, and stuff below the SongList takes
// up half the screen, it will still report a full screen's worth of visible songs.
// This is fine because we probably want to load songs at the bottom of the list
// anyway, and in order to scroll this far down, we already had to load them,
// so no use throwing them away really
let number_visible_songs = (playbar_top - songlist_top.max(0.0)).max(0.0) / SONG_ROW_HEIGHT;
// Number of songs to render above and below the center song in the visible area
let render_count = (number_visible_songs * SONG_RENDER_MULTIPLIER / 2.0).ceil() as isize;
// The index of the song that is in the center of the visible area
let center_song_index = (topmost_visible_song_index as f64 + (number_visible_songs / 2.0))
.max(0.0)
.floor() as isize;
// The indexes of the first and last songs to load and render
let first_song_index = (center_song_index - render_count).max(0) as usize;
// Round down to the nearest SONG_GROUP_SIZE
let first_song_index = (first_song_index / SONG_GROUP_SIZE) * SONG_GROUP_SIZE;
let first_song_index = first_song_index.min(song_count.max(1) - 1);
let last_song_index = (center_song_index + render_count).max(0) as usize;
// Round up to the nearest SONG_GROUP_SIZE
let last_song_index = last_song_index.div_ceil(SONG_GROUP_SIZE) * SONG_GROUP_SIZE;
let last_song_index = last_song_index.min(song_count);
let render_range = first_song_index..last_song_index;
// The size, in pixels, of the space above the first rendered song
let top_spacer_size = render_range.start as f64 * SONG_ROW_HEIGHT;
(render_range, top_spacer_size)
})
}
/// Render a group of songs
pub trait RenderGroup<T>
where
T: Send + Sync + 'static + Clone,
Self: Send + Sync + Sized + 'static + PartialEq + Clone,
{
/// Arguments provided to row render functions
type RowArgs: Send;
/// The number of rows in this group
fn size(&self) -> usize {
self.list_range().len()
}
/// The range of the overall list this group takes up
fn list_range(&self) -> Range<usize>;
/// Create `RowArgs` from `Song`s
fn join_data(
&self,
songs: Vec<Option<frontend::Song>>,
play_callback: PlayCallback,
) -> impl Iterator<Item = Option<Self::RowArgs>>;
/// Access the inner song data resource
fn resource(&self) -> SongListResource;
/// Render a group
fn render<F, V>(
self,
col_count: usize,
cols: StoredValue<F>,
play_callback: PlayCallback,
) -> impl IntoView + use<Self, T, F, V>
where
V: IntoView + 'static,
F: Fn(Self::RowArgs) -> V + Send + Sync + 'static,
{
let group_size = self.size();
let loading_rows = move || {
(0..group_size)
.map(|_| {
view! { <SongLoading col_count /> }
})
.collect::<Vec<_>>()
};
view! {
<Transition
fallback=loading_rows
>
{move || {
self.resource().get().map(|songs| {
match songs {
Ok(songs) => Either::Left({
self.join_data(songs, play_callback).map(|row_args| {
match row_args {
Some(row_args) => Either::Left({
view! {
<SongRowWrapper col_count >
{
cols.read_value()(row_args)
}
</SongRowWrapper>
}
}),
None => Either::Right({
let error = AccessError::NotFound;
view! {
<SongError col_count error />
}
}),
}
}).collect::<Vec<_>>()
}),
Err(error) => Either::Right(
(0..group_size).map(|_| view! {
<SongError col_count error={error.clone()} />
}).collect::<Vec<_>>()
)
}
})
}}
</Transition>
}
}
}
impl RenderGroup<i32> for SongGroup {
type RowArgs = (usize, PlayCallback, frontend::Song);
fn list_range(&self) -> Range<usize> {
self.list_range.clone()
}
fn join_data(
&self,
songs: Vec<Option<frontend::Song>>,
play_callback: PlayCallback,
) -> impl Iterator<Item = Option<Self::RowArgs>> {
let idx_offset = RenderGroup::list_range(self).start;
songs.into_iter().enumerate().map(move |(song_idx, song)| {
song.map(|song| (song_idx + idx_offset, play_callback, song))
})
}
fn resource(&self) -> SongListResource {
self.resource.clone()
}
}
impl<T> RenderGroup<(i32, T)> for SongGroupExt<T>
where
T: Send + Sync + 'static + Clone,
{
type RowArgs = (usize, PlayCallback, frontend::Song, T);
fn list_range(&self) -> Range<usize> {
self.list_range.clone()
}
fn join_data(
&self,
songs: Vec<Option<frontend::Song>>,
play_callback: PlayCallback,
) -> impl Iterator<Item = Option<Self::RowArgs>> {
let idx_offset = RenderGroup::list_range(self).start;
std::iter::zip(songs, self.extra.clone()).enumerate().map(
move |(song_idx, (song, extra))| {
song.map(|song| (song_idx + idx_offset, play_callback, song, extra))
},
)
}
fn resource(&self) -> SongListResource {
self.resource.clone()
}
}
/// Generate song list groups from a `Signal` of the whole song list
pub trait GenerateGroups<T>
where
T: Send + Sync + 'static + Clone,
Self: Send + Sync + Sized + 'static + PartialEq + Clone,
{
/// Arguments provided to row render functions
type RowArgs: Send;
/// Create group from song ids
fn fetch(song_ids: &[T], list_range: Range<usize>) -> Self;
/// Determine whether this group is for the given song ids
fn matches_ids(&self, song_ids: &[T]) -> bool;
/// Get the size of this group in the list
fn size(&self) -> usize {
self.list_range().len()
}
/// The range of the overall list this group takes up
fn list_range(&self) -> Range<usize>;
/// Create `RowArgs` from `Song`s
fn join_data(
&self,
songs: Vec<Option<frontend::Song>>,
play_callback: PlayCallback,
) -> impl Iterator<Item = Option<Self::RowArgs>>;
/// Access the inner song data resource
fn resource(&self) -> SongListResource;
/// Generate song groups from the `Signal`
fn generate_groups(
song_ids: Signal<Vec<T>>,
songlist_elem: NodeRef<Div>,
) -> Memo<(f64, usize, Vec<Self>)> {
// "Version" of song IDs. Used to trigger re-fetching of songs when the IDs change.
// Consistently increments with each change to the song IDs.
let song_ids_version = Memo::new(move |prev| {
song_ids.track();
prev.map_or(0, |v| v + 1)
});
let song_count = Signal::derive(move || song_ids.read().len());
let render_range = get_render_range(songlist_elem, song_count);
// Memo for a tuple of first and last index to fetch/render, and the spacer size
// This is all grouped together to avoid race conditions (one updating before the other)
Memo::new(move |prev: Option<&(f64, usize, Vec<Self>)>| {
let song_ids = song_ids.get();
let song_ids_version = song_ids_version.get();
let (render_range, top_spacer_size) = render_range.get();
let render_groups: Vec<Self> =
if let Some((_, prev_song_ids_version, prev_render_groups)) = prev
&& song_ids_version == *prev_song_ids_version
{
song_ids[render_range.clone()]
.chunks(SONG_GROUP_SIZE)
.enumerate()
.map(|(chunk_idx, chunk)| {
if let Some(group) = prev_render_groups
.iter()
.find(|group| group.matches_ids(chunk))
{
(*group).clone()
} else {
let song_idx = (chunk_idx * SONG_GROUP_SIZE) + render_range.start;
let range = song_idx..(song_idx + chunk.len());
Self::fetch(chunk, range)
}
})
.collect()
} else {
// Create all-new groups if version changed
song_ids[render_range.clone()]
.chunks(SONG_GROUP_SIZE)
.enumerate()
.map(|(chunk_idx, chunk)| {
let song_idx = (chunk_idx * SONG_GROUP_SIZE) + render_range.start;
let range = song_idx..(song_idx + chunk.len());
Self::fetch(chunk, range)
})
.collect::<Vec<_>>()
};
(top_spacer_size, song_ids_version, render_groups)
})
}
}
impl<T> GenerateGroups<(i32, T)> for SongGroupExt<T>
where
T: Send + Sync + 'static + Clone,
{
type RowArgs = (usize, PlayCallback, frontend::Song, T);
fn fetch(song_ids: &[(i32, T)], list_range: Range<usize>) -> Self {
let (song_ids, extra): (Vec<_>, Vec<_>) = song_ids.iter().cloned().unzip();
let song_ids_2 = song_ids.clone();
let resource = ArcResource::new(move || song_ids_2.clone(), api::songs::get_songs_by_id);
Self {
resource,
extra,
song_ids,
list_range,
}
}
fn matches_ids(&self, song_ids: &[(i32, T)]) -> bool {
self.song_ids == song_ids.iter().map(|(id, _extra)| *id).collect::<Vec<_>>()
}
fn list_range(&self) -> Range<usize> {
self.list_range.clone()
}
fn join_data(
&self,
songs: Vec<Option<frontend::Song>>,
play_callback: PlayCallback,
) -> impl Iterator<Item = Option<Self::RowArgs>> {
let idx_offset = RenderGroup::list_range(self).start;
std::iter::zip(songs, self.extra.clone()).enumerate().map(
move |(song_idx, (song, extra))| {
song.map(|song| (song_idx + idx_offset, play_callback, song, extra))
},
)
}
fn resource(&self) -> SongListResource {
self.resource.clone()
}
}
impl GenerateGroups<i32> for SongGroup {
type RowArgs = (usize, PlayCallback, frontend::Song);
fn fetch(song_ids: &[i32], list_range: Range<usize>) -> Self {
let song_ids = song_ids.to_vec();
let song_ids_2 = song_ids.clone();
let resource = ArcResource::new(move || song_ids_2.clone(), api::songs::get_songs_by_id);
Self {
resource,
song_ids,
list_range,
}
}
fn matches_ids(&self, song_ids: &[i32]) -> bool {
self.song_ids == song_ids
}
fn list_range(&self) -> Range<usize> {
self.list_range.clone()
}
fn join_data(
&self,
songs: Vec<Option<frontend::Song>>,
play_callback: PlayCallback,
) -> impl Iterator<Item = Option<Self::RowArgs>> {
let idx_offset = RenderGroup::list_range(self).start;
songs.into_iter().enumerate().map(move |(song_idx, song)| {
song.map(|song| (song_idx + idx_offset, play_callback, song))
})
}
fn resource(&self) -> SongListResource {
self.resource.clone()
}
}
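
The version check in the render memo above lets unchanged chunks of song ids keep their previously fetched groups instead of re-requesting them. Stripped of the Leptos `Memo`/`ArcResource` machinery, the reuse pass looks roughly like the sketch below (`GROUP_SIZE`, `Group`, and `regenerate` are illustrative names, not the crate's API):

```rust
use std::ops::Range;

const GROUP_SIZE: usize = 10; // illustrative stand-in for SONG_GROUP_SIZE

// A fetched group of songs: the ids it covers and its range in the full list.
#[derive(Clone, Debug, PartialEq)]
struct Group {
    song_ids: Vec<i32>,
    list_range: Range<usize>,
}

fn fetch(chunk: &[i32], range: Range<usize>) -> Group {
    // In the real code this kicks off an ArcResource request.
    Group { song_ids: chunk.to_vec(), list_range: range }
}

// Rebuild the visible groups, reusing any previous group whose ids match.
fn regenerate(song_ids: &[i32], render_range: Range<usize>, prev: &[Group]) -> Vec<Group> {
    song_ids[render_range.clone()]
        .chunks(GROUP_SIZE)
        .enumerate()
        .map(|(chunk_idx, chunk)| {
            if let Some(group) = prev.iter().find(|g| g.song_ids == chunk) {
                group.clone() // unchanged chunk: keep the already-fetched data
            } else {
                let start = chunk_idx * GROUP_SIZE + render_range.start;
                fetch(chunk, start..start + chunk.len())
            }
        })
        .collect()
}

fn main() {
    let ids: Vec<i32> = (0..25).collect();
    let first = regenerate(&ids, 0..25, &[]);
    assert_eq!(first.len(), 3); // chunks of 10, 10, 5
    // A second pass over the same ids reuses every group.
    let second = regenerate(&ids, 0..25, &first);
    assert_eq!(first, second);
}
```

In the component itself this runs inside a `Memo` keyed on the song-id version, so a version bump falls through to the all-new-groups branch rather than matching stale chunks.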


@@ -0,0 +1,17 @@
pub mod groups;
pub mod song;
pub mod song_list;
pub mod song_list_defaults;
pub mod song_list_header;
pub mod all {
use super::*;
pub use song::*;
pub use song_list::{CustomSongList, DisplaySongList, PlayCallback, SONG_ROW_HEIGHT, SongList};
pub use groups::{GenerateGroups, RenderGroup, SongGroup, SongGroupExt};
pub use song_list_defaults::*;
pub use song_list_header::*;
}


@@ -0,0 +1,283 @@
use crate::prelude::*;
/// Width and height of the song image, in px
pub const SONG_IMAGE_SIZE: f64 = 37.;
const LIKE_DISLIKE_BTN_SIZE: &str = "1.7em";
/// Song row wrapper, sets up subgrid and styling of row
#[component]
pub fn SongRowWrapper(col_count: usize, children: Children) -> impl IntoView {
let style =
format!("grid-column: span {col_count} / span {col_count}; height: {SONG_ROW_HEIGHT}px;");
view! {
<div class="hover:bg-neutral-700 border-t border-neutral-600 group
grid grid-cols-subgrid items-center *:min-w-0 *:truncate pl-2" style=style>
{children()}
</div>
}
}
/// A full-width song row wrapper, allowing any children to span the full song row subgrid width
#[component]
pub fn FullSongRowWrapper(col_count: usize, children: Children) -> impl IntoView {
let grid_col_style = format!("grid-column: span {col_count} / span {col_count};");
view! {
<SongRowWrapper col_count>
<div style=grid_col_style>
{children()}
</div>
</SongRowWrapper>
}
}
/// Full-width song row loading indicator
#[component]
pub fn SongLoading(col_count: usize) -> impl IntoView {
view! {
<FullSongRowWrapper col_count>
<Loading/>
</FullSongRowWrapper>
}
}
/// Full-width song row error
#[component]
pub fn SongError(col_count: usize, #[prop(into)] error: BackendError) -> impl IntoView {
view! {
<FullSongRowWrapper col_count>
{error.to_component()}
</FullSongRowWrapper>
}
}
/// Song list index
#[component]
pub fn SongListIndex(list_index: usize) -> impl IntoView {
view! {
<div class="justify-self-center truncate">
{list_index + 1}
</div>
}
}
/// A simple song playing check. Determines if the currently
/// playing song's id matches the given song id and playback is active
fn get_song_playing(song_id: i32) -> Signal<bool> {
Signal::derive(move || {
GlobalState::play_status().with(|status| {
status.queue.front().map(|song| song.id) == Some(song_id) && status.playing
})
})
}
#[component]
pub fn SongImage(
song: frontend::Song,
#[prop(default = get_song_playing(song.id))] song_playing: Signal<bool>,
list_index: usize,
play_callback: PlayCallback,
#[prop(default = true)] play_button: bool,
) -> impl IntoView {
let icon = Signal::derive(move || {
if song_playing.get() {
icondata::BsPauseFill
} else {
icondata::BsPlayFill
}
});
let icon_style = Signal::derive(move || {
if song_playing.get() {
"w-6 h-6 absolute top-1/2 left-1/2 translate-[-50%]"
} else {
"w-6 h-6 opacity-0 group-hover:opacity-100 absolute top-1/2 left-1/2 translate-[-50%]"
}
});
let song_image_size_style = format!("height: {SONG_IMAGE_SIZE}px; width: {SONG_IMAGE_SIZE}px;");
view! {
<div class="relative" style={song_image_size_style.clone()}>
<img class="group-hover:brightness-45" src={song.image_path.clone().path()} style={song_image_size_style.clone()} />
{
play_button.then(|| {
let toggle_play = move |_| {
if song_playing.get() {
GlobalState::play_status().update(|status| {
status.playing = false;
});
} else {
play_callback.run((list_index, song.clone()));
}
};
view! {
<Icon icon on:click={toggle_play} {..} class=icon_style />
}
})
}
</div>
}
}
#[component]
pub fn SongTitleArtists(title: String, artists: Vec<backend::Artist>) -> impl IntoView {
view! {
<div class="truncate">
<div class="font-semibold truncate">
{title}
</div>
<div class="text-neutral-400 truncate">
<ArtistList artists />
</div>
</div>
}
}
#[component]
pub fn SongAlbum(album: Option<backend::Album>) -> impl IntoView {
if let Some(album) = album {
Either::Left(view! {
<div class="truncate min-w-0">
<a class="hover:underline active:text-controls-active"
href={format!("/album/{}", album.id)}>{album.title.clone()}</a>
</div>
})
} else {
Either::Right(view! {
<div/>
})
}
}
/// Display like and dislike buttons for a song, and indicate if the song is liked or disliked
#[component]
pub fn SongLikeDislike(
song_id: i32,
liked: RwSignal<bool>,
disliked: RwSignal<bool>,
) -> impl IntoView {
let like_icon = Signal::derive(move || {
if liked.get() {
icondata::TbThumbUpFilled
} else {
icondata::TbThumbUp
}
});
let dislike_icon = Signal::derive(move || {
if disliked.get() {
icondata::TbThumbDownFilled
} else {
icondata::TbThumbDown
}
});
let like_class = Signal::derive(move || {
if liked.get() {
""
} else {
"opacity-0 group-hover:opacity-100"
}
});
let dislike_class = Signal::derive(move || {
if disliked.get() {
""
} else {
"opacity-0 group-hover:opacity-100"
}
});
// If an error occurs, check the like/dislike status again to ensure consistency
let check_like_dislike = move || {
spawn_local(async move {
if let Ok((like, dislike)) = api::songs::get_like_dislike_song(song_id).await {
liked.set(like);
disliked.set(dislike);
}
});
};
let toggle_like = move |_| {
let new_liked = !liked.get_untracked();
liked.set(new_liked);
disliked.set(disliked.get_untracked() && !liked.get_untracked());
spawn_local(async move {
match api::songs::set_like_song(song_id, new_liked).await {
Ok(_) => {}
Err(e) => {
leptos_err!("Error setting like: {}", e);
check_like_dislike();
}
}
});
};
let toggle_dislike = move |_| {
disliked.set(!disliked.get_untracked());
liked.set(liked.get_untracked() && !disliked.get_untracked());
spawn_local(async move {
match api::songs::set_dislike_song(song_id, disliked.get_untracked()).await {
Ok(_) => {}
Err(e) => {
leptos_err!("Error setting dislike: {}", e);
check_like_dislike();
}
}
});
};
view! {
<div class="flex">
<button class="control scale-x-[-1]" on:click=toggle_dislike>
<Icon
width=LIKE_DISLIKE_BTN_SIZE
height=LIKE_DISLIKE_BTN_SIZE
icon={dislike_icon}
{..} class=dislike_class />
</button>
<button class="control" on:click=toggle_like>
<Icon
width=LIKE_DISLIKE_BTN_SIZE
height=LIKE_DISLIKE_BTN_SIZE
icon={like_icon}
{..} class=like_class />
</button>
</div>
}
}
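
The toggle handlers above update the like/dislike signals optimistically (before the server round-trip) while keeping the two states mutually exclusive. A hypothetical plain-Rust model of just that state logic, with the signal plumbing removed:

```rust
// Mutually exclusive like/dislike state, updated optimistically.
// Illustrative model of the signal updates in `SongLikeDislike`;
// the real component also calls the API and refetches on error.
#[derive(Debug, PartialEq, Clone, Copy)]
struct LikeState {
    liked: bool,
    disliked: bool,
}

impl LikeState {
    fn toggle_like(&mut self) {
        self.liked = !self.liked;
        // a new like clears an existing dislike
        self.disliked = self.disliked && !self.liked;
    }
    fn toggle_dislike(&mut self) {
        self.disliked = !self.disliked;
        // a new dislike clears an existing like
        self.liked = self.liked && !self.disliked;
    }
}

fn main() {
    let mut s = LikeState { liked: false, disliked: true };
    s.toggle_like();
    assert_eq!(s, LikeState { liked: true, disliked: false });
    s.toggle_dislike();
    assert_eq!(s, LikeState { liked: false, disliked: true });
}
```

If the API call fails, the component re-reads the server state (`check_like_dislike`) so the optimistic update never drifts from the backend.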
#[component]
pub fn SongDuration(duration: i32) -> impl IntoView {
view! {
<div class="text-neutral-300 truncate">
{format!("{}:{:02}", duration / 60, duration % 60)}
</div>
}
}
#[component]
pub fn SongPlays(plays: i64) -> impl IntoView {
let plays_formatted = plays
.to_string()
.as_bytes()
.rchunks(3)
.rev()
.map(std::str::from_utf8)
.collect::<Result<Vec<&str>, _>>()
.unwrap()
.join(",");
view! {
<div class="text-neutral-300 truncate">
{plays_formatted}
</div>
}
}
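
The play-count formatting in `SongPlays` groups digits in threes from the right using `rchunks` over the ASCII bytes. Extracted as a standalone function (name is illustrative; works for the non-negative counts used here, but a multi-group negative number would mis-place the sign):

```rust
// Insert comma separators every three digits, as `SongPlays` does.
fn format_plays(plays: i64) -> String {
    plays
        .to_string()
        .as_bytes()
        .rchunks(3) // group digits from the right: "1234567" -> ["567", "234", "1"]
        .rev()      // restore left-to-right order
        .map(|chunk| std::str::from_utf8(chunk).unwrap()) // ASCII digits are valid UTF-8
        .collect::<Vec<_>>()
        .join(",")
}

fn main() {
    assert_eq!(format_plays(0), "0");
    assert_eq!(format_plays(1234), "1,234");
    assert_eq!(format_plays(1234567), "1,234,567");
}
```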


@@ -0,0 +1,245 @@
use crate::prelude::*;
use leptos::html::Div;
/// Hard-coded height of a song list row. Needed to calculate the top spacer height
pub const SONG_ROW_HEIGHT: f64 = 50.0;
/// Callback for clicking a play button on a song in a song list
/// Passes the list index and the song
pub type PlayCallback = Callback<(usize, frontend::Song)>;
/// Display a song list from a reactive source of song ids
pub trait DisplaySongList {
/// Arguments provided to row render functions
type RowArgs;
/// Group type to render
type Group;
fn display_inner<T, F, V>(
song_ids: Signal<Vec<T>>,
headers: Vec<SongListHeader>,
cols: StoredValue<F>,
play_callback: PlayCallback,
) -> impl IntoView
where
T: Send + Sync + 'static + Clone,
Self::Group: GenerateGroups<T>,
Self::Group: RenderGroup<T, RowArgs = <Self::Group as GenerateGroups<T>>::RowArgs>,
V: IntoView + 'static,
F: Fn(<Self::Group as GenerateGroups<T>>::RowArgs) -> V + Send + Sync + 'static,
{
let songlist_elem = NodeRef::<Div>::new();
let resources = GenerateGroups::<T>::generate_groups(song_ids, songlist_elem);
let songlist_height_style = Signal::derive(move || {
let height = song_ids.read().len() as f64 * SONG_ROW_HEIGHT;
// Accounts for spacing of the header row (3 padding top and bottom, small text, 1px bottom border)
format!(
"height: calc((var(--spacing) * 3 * 2) + var(--text-sm--line-height) * var(--text-sm) + 1px + {height}px);"
)
});
let grid_col_style = SongListHeader::grid_col_style(&headers);
let col_count = headers.len();
view! {
<div class="w-full text-sm mb-3" node_ref=songlist_elem style=songlist_height_style>
{move || {
let (top_spacer_size, _song_ids_version, groups) = resources.get();
let groups_empty = groups.is_empty();
view! {
<div class="grid gap-x-3" style={format!("transform: translateY({top_spacer_size}px); {grid_col_style}")}>
{
(!groups_empty).then_some(
view! {
<SongListHeaders headers={&headers} top_spacer_size />
}
)
}
{
groups
.into_iter()
.map(|group: Self::Group| {
group.render(col_count, cols, play_callback)
}).collect::<Vec<_>>()
}
</div>
}
}}
</div>
}
}
fn display<F, V>(
self,
headers: Vec<SongListHeader>,
cols: StoredValue<F>,
play_callback: PlayCallback,
) -> impl IntoView
where
V: IntoView + 'static,
F: Fn(Self::RowArgs) -> V + Send + Sync + 'static;
}
impl DisplaySongList for Signal<Vec<i32>> {
type RowArgs = (usize, PlayCallback, frontend::Song);
type Group = SongGroup;
fn display<F, V>(
self,
headers: Vec<SongListHeader>,
cols: StoredValue<F>,
play_callback: PlayCallback,
) -> impl IntoView
where
V: IntoView + 'static,
F: Fn(Self::RowArgs) -> V + Send + Sync + 'static,
{
Self::display_inner::<_, _, _>(self, headers, cols, play_callback)
}
}
impl<T> DisplaySongList for Signal<Vec<(i32, T)>>
where
T: Send + Sync + 'static + Clone,
{
type RowArgs = (usize, PlayCallback, frontend::Song, T);
type Group = SongGroupExt<T>;
fn display<F, V>(
self,
headers: Vec<SongListHeader>,
cols: StoredValue<F>,
play_callback: PlayCallback,
) -> impl IntoView
where
V: IntoView + 'static,
F: Fn(Self::RowArgs) -> V + Send + Sync + 'static,
{
Self::display_inner::<_, _, _>(self, headers, cols, play_callback)
}
}
impl<T> DisplaySongList for Resource<BackendResult<Vec<T>>>
where
T: Send + Sync + Clone,
Signal<Vec<T>>: DisplaySongList,
{
type RowArgs = <Signal<Vec<T>> as DisplaySongList>::RowArgs;
type Group = <Signal<Vec<T>> as DisplaySongList>::Group;
fn display<F, V>(
self,
headers: Vec<SongListHeader>,
cols: StoredValue<F>,
play_callback: PlayCallback,
) -> impl IntoView
where
V: IntoView + 'static,
F: Fn(Self::RowArgs) -> V + Send + Sync + 'static,
{
let data_sig = RwSignal::new(vec![]);
let loading_sig = RwSignal::new(true);
let err_sig = RwSignal::new(None);
Effect::new(move |_| {
self.get().map(|song_ids| {
loading_sig.set(false);
match song_ids {
Ok(song_ids) => {
data_sig.set(song_ids);
err_sig.set(None);
}
Err(e) => {
data_sig.set(vec![]);
err_sig.set(Some(e));
}
}
})
});
let data_sig: Signal<_> = data_sig.into();
view! {
<Show
when=move || loading_sig.get()
>
<Loading />
</Show>
{move || {
err_sig.get().map(|error| error.to_component())
}}
{
data_sig.display(headers, cols, play_callback)
}
}
}
}
/// Basic play callback function. Simply clears the queue and pushes
/// only the current song into the queue. TODO This should be fixed to add
/// the entire song list to the queue, but this requires changes to
/// how the queue works, to use song ids instead of songs
fn play_callback(_list_index: usize, song: frontend::Song) {
let song_id = song.id;
GlobalState::play_status().update(move |status| {
if status.queue.front().map(|song| song.id) == Some(song_id) {
// If the clicked song is already at the front of the queue, just play it
status.playing = true;
} else {
// Otherwise, add the currently playing song to the history,
// clear the queue, and queue only the clicked song
if let Some(last_playing) = status.queue.pop_front() {
status.history.push_back(last_playing);
}
status.queue.clear();
status.queue.push_back(song);
status.playing = true;
}
});
}
#[component]
pub fn SongList<S>(
song_ids: S,
#[prop(default = default_headers())] headers: Vec<SongListHeader>,
#[prop(default = default_song_list_content)] cols: fn(
(usize, PlayCallback, frontend::Song),
) -> AnyView,
) -> impl IntoView
where
S: DisplaySongList<RowArgs = (usize, PlayCallback, frontend::Song)>,
{
song_ids.display(headers, StoredValue::new(cols), play_callback.into())
}
#[component]
pub fn CustomSongList<S, F, V>(
song_ids: S,
#[prop(default = vec![])] song_list_header: Vec<SongListHeader>,
children: F,
) -> impl IntoView
where
S: DisplaySongList,
<S as DisplaySongList>::RowArgs: 'static,
V: IntoView + 'static,
F: Fn(<S as DisplaySongList>::RowArgs) -> V + Send + Sync + 'static,
{
song_ids.display(
song_list_header,
StoredValue::new(children),
play_callback.into(),
)
}
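
The queue manipulation in `play_callback` can be modeled without Leptos state as a pair of `VecDeque`s (song ids stand in for full songs; `PlayStatus` and `play` here are illustrative names for the sketch):

```rust
use std::collections::VecDeque;

// Plain-Rust model of the queue behavior in `play_callback` above.
struct PlayStatus {
    queue: VecDeque<i32>,
    history: VecDeque<i32>,
    playing: bool,
}

fn play(status: &mut PlayStatus, song_id: i32) {
    if status.queue.front() == Some(&song_id) {
        // Clicked song is already at the front: just resume playback.
        status.playing = true;
    } else {
        // Move the currently playing song into history, then start fresh
        // with only the clicked song queued.
        if let Some(last_playing) = status.queue.pop_front() {
            status.history.push_back(last_playing);
        }
        status.queue.clear();
        status.queue.push_back(song_id);
        status.playing = true;
    }
}

fn main() {
    let mut status = PlayStatus {
        queue: VecDeque::from([1, 2, 3]),
        history: VecDeque::new(),
        playing: false,
    };
    play(&mut status, 5);
    assert_eq!(status.queue, VecDeque::from([5]));
    assert_eq!(status.history, VecDeque::from([1]));
    assert!(status.playing);
    play(&mut status, 5); // already at front: only resumes
    assert_eq!(status.queue, VecDeque::from([5]));
}
```

As the TODO in the source notes, a fuller implementation would queue the rest of the list after the clicked song, which requires the queue to hold ids rather than full songs.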


@@ -0,0 +1,194 @@
use crate::prelude::*;
pub fn default_headers() -> Vec<SongListHeader> {
vec![
SongListHeader::new("#", "25px"),
SongListHeader::new("", format!("{SONG_IMAGE_SIZE}px")), // Art
SongListHeader::new("Title", "4fr"),
SongListHeader::new("Album", "3fr"),
SongListHeader::new("", "min-content"), // Like / Dislike
SongListHeader::new("Duration", "1fr"),
]
}
pub fn default_song_list_content(
(list_index, play_callback, song): (usize, PlayCallback, frontend::Song),
) -> AnyView {
let liked = RwSignal::new(
song.like_dislike
.as_ref()
.map(|(like, _)| *like)
.unwrap_or(false),
);
let disliked = RwSignal::new(
song.like_dislike
.as_ref()
.map(|(_, dislike)| *dislike)
.unwrap_or(false),
);
view! {
<SongListIndex list_index />
<SongImage
song={song.clone()}
song_playing={Signal::stored(false)}
list_index
play_callback
/>
<SongTitleArtists
title={song.title.clone()}
artists={song.artists.clone()}
/>
<SongAlbum
album={song.album.clone()}
/>
<SongLikeDislike
song_id={song.id}
liked
disliked
/>
<SongDuration
duration={song.duration}
/>
}
.into_any()
}
pub fn queue_headers() -> Vec<SongListHeader> {
vec![
SongListHeader::new("#", "25px"),
SongListHeader::new("", format!("{SONG_IMAGE_SIZE}px")), // Art
SongListHeader::new("Title", "4fr"),
SongListHeader::new("Duration", "1fr"),
]
}
pub fn queue_content(
(list_index, play_callback, song): (usize, PlayCallback, frontend::Song),
) -> AnyView {
view! {
<SongListIndex list_index />
<SongImage
song={song.clone()}
list_index
play_callback
/>
<SongTitleArtists
title={song.title.clone()}
artists={song.artists.clone()}
/>
<SongDuration
duration={song.duration}
/>
}
.into_any()
}
pub fn album_headers() -> Vec<SongListHeader> {
vec![
SongListHeader::new("#", "25px"),
SongListHeader::new("", format!("{SONG_IMAGE_SIZE}px")), // Art
SongListHeader::new("Title", "4fr"),
SongListHeader::new("", "min-content"), // Like / Dislike
SongListHeader::new("Duration", "1fr"),
]
}
pub fn album_content(
(list_index, play_callback, song): (usize, PlayCallback, frontend::Song),
) -> AnyView {
let liked = RwSignal::new(
song.like_dislike
.as_ref()
.map(|(like, _)| *like)
.unwrap_or(false),
);
let disliked = RwSignal::new(
song.like_dislike
.as_ref()
.map(|(_, dislike)| *dislike)
.unwrap_or(false),
);
view! {
<SongListIndex list_index />
<SongImage
song={song.clone()}
list_index
play_callback
/>
<SongTitleArtists
title={song.title.clone()}
artists={song.artists.clone()}
/>
<SongLikeDislike
song_id={song.id}
liked
disliked
/>
<SongDuration
duration={song.duration}
/>
}
.into_any()
}
#[component]
pub fn QueueSongList<S>(song_ids: S) -> impl IntoView
where
S: DisplaySongList<RowArgs = (usize, PlayCallback, frontend::Song)>,
{
view! {
<SongList
song_ids
headers={queue_headers()}
cols={queue_content}
/>
}
}
#[component]
pub fn AlbumSongList<S>(song_ids: S) -> impl IntoView
where
S: DisplaySongList<RowArgs = (usize, PlayCallback, frontend::Song)>,
{
view! {
<SongList
song_ids
headers={album_headers()}
cols={album_content}
/>
}
}
#[component]
pub fn PlaysSongList<S>(song_ids: S) -> impl IntoView
where
S: DisplaySongList<RowArgs = (usize, PlayCallback, frontend::Song, i64)>,
{
let mut headers = default_headers();
headers.push(SongListHeader::new("Plays", "1fr"));
view! {
<CustomSongList
song_ids
song_list_header={headers}
let:((list_index, play_callback, song, plays))
>
{default_song_list_content((list_index, play_callback, song))}
<SongPlays plays />
</CustomSongList>
}
}


@@ -0,0 +1,64 @@
use crate::prelude::*;
#[slot]
pub struct SongListHeader {
#[prop(into)]
header: String,
#[prop(default = "2fr".to_owned())]
width: String,
}
impl SongListHeader {
pub fn new<S1: Into<String>, S2: Into<String>>(header: S1, width: S2) -> Self {
Self {
header: header.into(),
width: width.into(),
}
}
pub fn grid_col_style(headers: &[Self]) -> String {
let widths: String = headers
.iter()
.enumerate()
.map(|(i, header)| {
// Add additional size to the leftmost column for left hand padding
// See `SongRowWrapper` (pl-2)
// TODO better way to do this?
if i == 0 {
format!("calc(var(--spacing) * 2 + {})", header.width)
} else {
header.width.clone()
}
})
.intersperse(" ".to_owned())
.collect();
format!("grid-template-columns: {widths};")
}
}
#[component]
pub fn SongListHeaders<'a>(
headers: &'a [SongListHeader],
top_spacer_size: f64,
) -> impl IntoView + use<> {
let col_count = headers.len();
let grid_col_style = format!("grid-column: span {col_count} / span {col_count};");
view! {
<div class="sticky border-neutral-600 group p-3 grid
grid-cols-subgrid items-center z-100 border-b bg-bg-light text-neutral-400"
style={format!("{grid_col_style} top: calc((var(--spacing) * -4) - {top_spacer_size}px);")}
>
{
headers.iter().map(|header| view! {
<div>
{header.header.clone()}
</div>
}).collect::<Vec<_>>()
}
</div>
}
}
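
`grid_col_style` joins the per-column widths into a `grid-template-columns` value, widening only the first column to absorb the row's `pl-2` left padding. A minimal framework-free sketch of the same computation (using `join` in place of `Iterator::intersperse`, which is nightly-only in std):

```rust
// Build a CSS grid-template-columns value from per-column widths,
// padding the leftmost column as `SongListHeader::grid_col_style` does.
fn grid_col_style(widths: &[&str]) -> String {
    let cols: Vec<String> = widths
        .iter()
        .enumerate()
        .map(|(i, w)| {
            if i == 0 {
                // extra room for the row's left padding (pl-2)
                format!("calc(var(--spacing) * 2 + {w})")
            } else {
                (*w).to_owned()
            }
        })
        .collect();
    format!("grid-template-columns: {};", cols.join(" "))
}

fn main() {
    assert_eq!(
        grid_col_style(&["25px", "4fr", "1fr"]),
        "grid-template-columns: calc(var(--spacing) * 2 + 25px) 4fr 1fr;"
    );
}
```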


@@ -1,245 +1,241 @@
use std::rc::Rc;
use leptos::leptos_dom::*;
use leptos::*;
use leptos_icons::*;
use leptos_router::Form;
use crate::prelude::*;
use leptos_router::components::Form;
use std::sync::Arc;
use web_sys::Response;
use crate::search::search_artists;
use crate::search::search_albums;
use crate::models::Artist;
use crate::models::Album;
#[component]
pub fn UploadBtn(dialog_open: RwSignal<bool>) -> impl IntoView {
let open_dialog = move |_| {
dialog_open.set(true);
};
let open_dialog = move |_| {
dialog_open.set(true);
};
view! {
<button class="upload-btn" on:click=open_dialog>
<div class="add-sign">
<Icon icon=icondata::IoAddSharp />
</div>
Upload
</button>
}
view! {
<button class="upload-btn add-btns" on:click=open_dialog>
Upload Song
</button>
}
}
#[component]
pub fn Upload(open: RwSignal<bool>) -> impl IntoView {
// Create signals for the artist input and the filtered artists
let (artists, set_artists) = create_signal("".to_string());
let (filtered_artists, set_filtered_artists) = create_signal(vec![]);
// Create signals for the artist input and the filtered artists
let (artists, set_artists) = signal("".to_string());
let (filtered_artists, set_filtered_artists) = signal(vec![]);
let (albums, set_albums) = create_signal("".to_string());
let (filtered_albums, set_filtered_albums) = create_signal(vec![]);
let (albums, set_albums) = signal("".to_string());
let (filtered_albums, set_filtered_albums) = signal(vec![]);
let (error_msg, set_error_msg) = create_signal::<Option<String>>(None);
let (error_msg, set_error_msg) = signal::<Option<String>>(None);
let close_dialog = move |ev: leptos::ev::MouseEvent| {
ev.prevent_default();
open.set(false);
};
// Create a filter function to handle filtering artists
// Allow users to search for artists by name, converts the artist name to artist id to be handed off to backend
let handle_filter_artists = move |ev: leptos::ev::Event| {
ev.prevent_default();
let close_dialog = move |ev: leptos::ev::MouseEvent| {
ev.prevent_default();
open.set(false);
};
// Create a filter function to handle filtering artists
// Allow users to search for artists by name, converts the artist name to artist id to be handed off to backend
let handle_filter_artists = move |ev: leptos::ev::Event| {
ev.prevent_default();
let artist_input: String = event_target_value(&ev);
let artist_input: String = event_target_value(&ev);
//Get the artist that we are currently searching for
let mut all_artists: Vec<&str> = artist_input.split(",").collect();
let search = all_artists.pop().unwrap().to_string();
//Update the artist signal with the input
set_artists.update(|value: &mut String| *value = artist_input);
//Get the artist that we are currently searching for
let mut all_artists: Vec<&str> = artist_input.split(",").collect();
let search = all_artists.pop().unwrap().to_string();
spawn_local(async move {
let filter_results = search_artists(search, 3).await;
if let Err(err) = filter_results {
log!("Error filtering artists: {:?}", err);
} else if let Ok(artists) = filter_results {
log!("Filtered artists: {:?}", artists);
//Update the artist signal with the input
set_artists.update(|value: &mut String| *value = artist_input);
set_filtered_artists.update(|value| *value = artists);
}
})
};
// Create a filter function to handle filtering albums
// Allow users to search for albums by title, converts the album title to album id to be handed off to backend
let handle_filter_albums = move |ev: leptos::ev::Event| {
ev.prevent_default();
spawn_local(async move {
let filter_results = api::search::search_artists(search, 3)
.await
.map(|results| results.into_iter().map(|(artist, _score)| artist).collect());
let album_input: String = event_target_value(&ev);
//Update the album signal with the input
set_albums.update(|value: &mut String| *value = album_input);
if let Err(err) = filter_results {
leptos_log!("Error filtering artists: {:?}", err);
} else if let Ok(artists) = filter_results {
leptos_log!("Filtered artists: {:?}", artists);
spawn_local(async move {
let filter_results = search_albums(albums.get_untracked(), 3).await;
if let Err(err) = filter_results {
log!("Error filtering albums: {:?}", err);
} else if let Ok(albums) = filter_results {
log!("Filtered albums: {:?}", albums);
set_filtered_albums.update(|value| *value = albums);
}
})
};
set_filtered_artists.update(|value| *value = artists);
}
});
};
// Create a filter function to handle filtering albums
// Allow users to search for albums by title, converts the album title to album id to be handed off to backend
let handle_filter_albums = move |ev: leptos::ev::Event| {
ev.prevent_default();
let handle_response = Rc::new(move |response: &Response| {
if response.ok() {
set_error_msg.update(|value| *value = None);
set_filtered_artists.update(|value| *value = vec![]);
set_filtered_albums.update(|value| *value = vec![]);
set_artists.update(|value| *value = "".to_string());
set_albums.update(|value| *value = "".to_string());
open.set(false);
} else {
// TODO: Extract error message from response
set_error_msg.update(|value| *value = Some("Error uploading song".to_string()));
}
});
let album_input: String = event_target_value(&ev);
view! {
<Show when=open fallback=move || view! {}>
<div class="upload-container" open=open>
<div class="close-button" on:click=close_dialog><Icon icon=icondata::IoClose /></div>
<div class="upload-header">
<h1>Upload Song</h1>
</div>
<Form action="/api/upload" method="POST" enctype=String::from("multipart/form-data")
class="upload-form" on_response=handle_response.clone()>
<div class="input-bx">
<input type="text" name="title" required class="text-input"/>
<span>Title</span>
</div>
<div class="artists has-search">
<div class="input-bx">
<input type="text" name="artist_ids" class="text-input" prop:value=artists on:input=handle_filter_artists/>
<span>Artists</span>
</div>
<Show
when=move || {filtered_artists.get().len() > 0}
fallback=move || view! {}
>
<ul class="artist_results search-results">
{
move || filtered_artists.get().iter().enumerate().map(|(_index,filtered_artist)| view! {
<Artist artist=filtered_artist.clone() artists=artists set_artists=set_artists set_filtered=set_filtered_artists/>
}).collect::<Vec<_>>()
}
</ul>
</Show>
</div>
<div class="albums has-search">
<div class="input-bx">
<input type="text" name="album_id" class="text-input" prop:value=albums on:input=handle_filter_albums/>
<span>Album ID</span>
</div>
<Show
when=move || {filtered_albums.get().len() > 0}
fallback=move || view! {}
>
<ul class="album_results search-results">
{
move || filtered_albums.get().iter().enumerate().map(|(_index,filtered_album)| view! {
<Album album=filtered_album.clone() _albums=albums set_albums=set_albums set_filtered=set_filtered_albums/>
}).collect::<Vec<_>>()
}
</ul>
</Show>
</div>
<div class="input-bx">
<input type="number" name="track_number" class="text-input"/>
<span>Track Number</span>
</div>
<div class="release-date">
<div class="left">
<span>Release</span>
<span>Date</span>
</div>
<input class="info" type="date" name="release_date"/>
</div>
<div class="file">
<span>File</span>
<input class="info" type="file" accept=".mp3" name="file" required/>
</div>
<button type="submit" class="upload-button">Upload</button>
</Form>
<Show
when=move || {error_msg.get().is_some()}
fallback=move || view! {}
>
<div class="error-msg">
<Icon icon=icondata::IoAlertCircleSharp />
{error_msg.get().as_ref().unwrap()}
</div>
</Show>
</div>
</Show>
}
//Update the album signal with the input
set_albums.update(|value: &mut String| *value = album_input);
spawn_local(async move {
let filter_results = api::search::search_albums(albums.get_untracked(), 3)
.await
.map(|results| results.into_iter().map(|(album, _score)| album).collect());
if let Err(err) = filter_results {
leptos_log!("Error filtering albums: {:?}", err);
} else if let Ok(albums) = filter_results {
leptos_log!("Filtered albums: {:?}", albums);
set_filtered_albums.update(|value| *value = albums);
}
});
};
let handle_response = Arc::new(move |response: &Response| {
if response.ok() {
set_error_msg.update(|value| *value = None);
set_filtered_artists.update(|value| *value = vec![]);
set_filtered_albums.update(|value| *value = vec![]);
set_artists.update(|value| *value = "".to_string());
set_albums.update(|value| *value = "".to_string());
open.set(false);
} else {
// TODO: Extract error message from response
set_error_msg.update(|value| *value = Some("Error uploading song".to_string()));
}
});
view! {
<Show when=open fallback=move || view! {}>
<dialog class="upload-container" open=open>
<div class="close-button" on:click=close_dialog><Icon icon={icondata::IoClose} /></div>
<div class="upload-header">
<h1>Upload Song</h1>
</div>
<Form action="/api/upload" method="POST" enctype=String::from("multipart/form-data")
on_response=handle_response.clone() {..} class="upload-form" >
<div class="input-bx">
<input type="text" name="title" required class="text-input"/>
<span>Title</span>
</div>
<div class="artists has-search">
<div class="input-bx">
<input type="text" name="artist_ids" class="text-input" prop:value=artists on:input=handle_filter_artists/>
<span>Artists</span>
</div>
<Show
when=move || {!filtered_artists.get().is_empty()}
fallback=move || view! {}
>
<ul class="artist_results search-results">
{
move || filtered_artists.get().iter().map(|filtered_artist| view! {
<Artist artist=filtered_artist.clone() artists=artists set_artists=set_artists set_filtered=set_filtered_artists/>
}).collect::<Vec<_>>()
}
</ul>
</Show>
</div>
<div class="albums has-search">
<div class="input-bx">
<input type="text" name="album_id" class="text-input" prop:value=albums on:input=handle_filter_albums/>
<span>Album ID</span>
</div>
<Show
when=move || {!filtered_albums.get().is_empty()}
fallback=move || view! {}
>
<ul class="album_results search-results">
{
move || filtered_albums.get().iter().map(|filtered_album| view! {
<Album album=filtered_album.clone() _albums=albums set_albums=set_albums set_filtered=set_filtered_albums/>
}).collect::<Vec<_>>()
}
</ul>
</Show>
</div>
<div class="input-bx">
<input type="number" name="track_number" class="text-input"/>
<span>Track Number</span>
</div>
<div class="release-date">
<div class="left">
<span>Release</span>
<span>Date</span>
</div>
<input class="info" type="date" name="release_date"/>
</div>
<div class="file">
<span>File</span>
<input class="info" type="file" accept=".mp3" name="file" required/>
</div>
<button type="submit" class="upload-button">Upload</button>
</Form>
<Show
when=move || {error_msg.get().is_some()}
fallback=move || view! {}
>
<div class="error-msg">
<Icon icon={icondata::IoAlertCircleSharp} />
{error_msg.get().unwrap()}
</div>
</Show>
</dialog>
</Show>
}
}
#[component]
pub fn Artist(artist: Artist, artists: ReadSignal<String>, set_artists: WriteSignal<String>, set_filtered: WriteSignal<Vec<Artist>>) -> impl IntoView {
// Converts artist name to artist id and adds it to the artist input
let add_artist = move |_| {
//Create an empty string to hold previous artist ids
let mut s: String = String::from("");
//Get the current artist input
let all_artists: String = artists.get();
//Split the input into a vector of artists separated by commas
let mut ids: Vec<&str> = all_artists.split(",").collect();
//If there is only one artist in the input, get their id equivalent and add it to the string
if ids.len() == 1 {
let value_str = match artist.id.clone() {
Some(v) => v.to_string(),
None => String::from("None"),
};
s.push_str(&value_str);
s.push_str(",");
set_artists.update(|value| *value = s);
//If there are multiple artists in the input, pop the last artist by string off the vector,
//get their id equivalent, and add it to the string
} else {
ids.pop();
for id in ids {
s.push_str(id);
s.push_str(",");
}
let value_str = match artist.id.clone() {
Some(v) => v.to_string(),
None => String::from("None"),
};
s.push_str(&value_str);
s.push_str(",");
set_artists.update(|value| *value = s);
}
//Clear the search results
set_filtered.update(|value| *value = vec![]);
};
pub fn Artist(
artist: frontend::Artist,
artists: ReadSignal<String>,
set_artists: WriteSignal<String>,
set_filtered: WriteSignal<Vec<frontend::Artist>>,
) -> impl IntoView {
// Converts artist name to artist id and adds it to the artist input
let add_artist = move |_| {
//Create an empty string to hold previous artist ids
let mut s: String = String::from("");
//Get the current artist input
let all_artists: String = artists.get();
//Split the input into a vector of artists separated by commas
let mut ids: Vec<&str> = all_artists.split(",").collect();
//If there is only one artist in the input, get their id equivalent and add it to the string
if ids.len() == 1 {
s.push_str(&artist.id.to_string());
s.push(',');
set_artists.update(|value| *value = s);
//If there are multiple artists in the input, pop the last artist by string off the vector,
//get their id equivalent, and add it to the string
} else {
ids.pop();
for id in ids {
s.push_str(id);
s.push(',');
}
s.push_str(&artist.id.to_string());
s.push(',');
set_artists.update(|value| *value = s);
}
//Clear the search results
set_filtered.update(|value| *value = vec![]);
};
view! {
<div class="artist result" on:click=add_artist>
{artist.name.clone()}
</div>
}
view! {
<div class="artist result" on:click=add_artist>
{artist.name.clone()}
</div>
}
}
#[component]
pub fn Album(album: Album, _albums: ReadSignal<String>, set_albums: WriteSignal<String>, set_filtered: WriteSignal<Vec<Album>>) -> impl IntoView {
//Converts album title to album id to upload a song
let add_album = move |_| {
let value_str = match album.id.clone() {
Some(v) => v.to_string(),
None => String::from("None"),
};
set_albums.update(|value| *value = value_str);
set_filtered.update(|value| *value = vec![]);
};
view! {
<div class="album result" on:click=add_album>
{album.title.clone()}
</div>
}
}
#[component]
pub fn Album(
album: frontend::Album,
_albums: ReadSignal<String>,
set_albums: WriteSignal<String>,
set_filtered: WriteSignal<Vec<frontend::Album>>,
) -> impl IntoView {
//Converts album title to album id to upload a song
let add_album = move |_| {
set_albums.update(|value| *value = album.id.to_string());
set_filtered.update(|value| *value = vec![]);
};
view! {
<div class="album result" on:click=add_album>
{album.title.clone()}
</div>
}
}


@@ -0,0 +1,31 @@
use crate::prelude::*;
#[component]
pub fn UploadDropdownBtn(dropdown_open: RwSignal<bool>) -> impl IntoView {
let open_dropdown = move |_| {
dropdown_open.set(!dropdown_open.get());
};
view! {
<button class={move || if dropdown_open() {"upload-dropdown-btn upload-dropdown-btn-active"} else {"upload-dropdown-btn"}} on:click=open_dropdown>
<div class="add-sign">
<Icon icon={icondata::IoAddSharp} />
</div>
</button>
}
}
#[component]
pub fn UploadDropdown(
dropdown_open: RwSignal<bool>,
upload_open: RwSignal<bool>,
add_artist_open: RwSignal<bool>,
add_album_open: RwSignal<bool>,
) -> impl IntoView {
view! {
<div class="upload-dropdown" on:click=move |_| dropdown_open.set(false)>
<UploadBtn dialog_open=upload_open />
<AddArtistBtn add_artist_open=add_artist_open/>
<AddAlbumBtn add_album_open=add_album_open/>
</div>
}
}


@@ -1,115 +0,0 @@
use cfg_if::cfg_if;
cfg_if! {
if #[cfg(feature = "ssr")] {
use leptos::logging::log;
use lazy_static::lazy_static;
use std::env;
use diesel::{
pg::PgConnection,
r2d2::ConnectionManager,
r2d2::PooledConnection,
r2d2::Pool,
};
use diesel_migrations::{
embed_migrations,
EmbeddedMigrations,
MigrationHarness,
};
// See https://leward.eu/notes-on-diesel-a-rust-orm/
// Define some types to make it easier to work with Diesel
type PgPool = Pool<ConnectionManager<PgConnection>>;
pub type PgPooledConn = PooledConnection<ConnectionManager<PgConnection>>;
// Keep a global instance of the pool
lazy_static! {
static ref DB_POOL: PgPool = init_db_pool();
}
/// Initialize the database pool
///
/// Uses DATABASE_URL environment variable to connect to the database if set,
/// otherwise builds a connection string from other environment variables.
///
/// Will panic if either the DATABASE_URL or POSTGRES_HOST environment variables
/// are not set, or if there is an error creating the pool.
///
/// # Returns
/// A database pool object, which can be used to get pooled connections
fn init_db_pool() -> PgPool {
let database_url = env::var("DATABASE_URL").unwrap_or_else(|_| {
// Build the database URL from environment variables
// Construct a separate log_url to avoid logging the password
let mut log_url = "postgres://".to_string();
let mut url = "postgres://".to_string();
if let Ok(user) = env::var("POSTGRES_USER") {
url.push_str(&user);
log_url.push_str(&user);
if let Ok(password) = env::var("POSTGRES_PASSWORD") {
url.push_str(":");
log_url.push_str(":");
url.push_str(&password);
log_url.push_str("********");
}
url.push_str("@");
log_url.push_str("@");
}
let host = env::var("POSTGRES_HOST").expect("DATABASE_URL or POSTGRES_HOST must be set");
url.push_str(&host);
log_url.push_str(&host);
if let Ok(port) = env::var("POSTGRES_PORT") {
url.push_str(":");
url.push_str(&port);
log_url.push_str(":");
log_url.push_str(&port);
}
if let Ok(dbname) = env::var("POSTGRES_DB") {
url.push_str("/");
url.push_str(&dbname);
log_url.push_str("/");
log_url.push_str(&dbname);
}
log!("Connecting to database: {}", log_url);
url
});
let manager = ConnectionManager::<PgConnection>::new(database_url);
PgPool::builder()
.build(manager)
.expect("Failed to create pool.")
}
/// Get a pooled connection to the database
///
/// Will panic if there is an error getting a connection from the pool.
///
/// # Returns
/// A pooled connection to the database
pub fn get_db_conn() -> PgPooledConn {
DB_POOL.get().expect("Failed to get a database connection from the pool.")
}
/// Embedded database migrations into the binary
const DB_MIGRATIONS: EmbeddedMigrations = embed_migrations!();
/// Run any pending migrations in the database
/// Always safe to call, as it will only run migrations that have not already been run
pub fn migrate() {
let db_con = &mut get_db_conn();
db_con.run_pending_migrations(DB_MIGRATIONS).expect("Could not run database migrations");
}
}
}
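The fallback branch of `init_db_pool` above assembles the connection string piece by piece while keeping a log-safe copy with the password masked. That construction can be sketched as a standalone function; the parameter names are illustrative stand-ins for the `POSTGRES_*` environment variables the real code reads:

```rust
// Build a Postgres connection URL from its parts, plus a copy safe to log
// (password replaced with asterisks), mirroring the env-var fallback above.
fn build_db_urls(
    user: Option<&str>,
    password: Option<&str>,
    host: &str,
    port: Option<&str>,
    dbname: Option<&str>,
) -> (String, String) {
    let mut url = String::from("postgres://");
    let mut log_url = String::from("postgres://");
    if let Some(user) = user {
        url.push_str(user);
        log_url.push_str(user);
        if let Some(password) = password {
            url.push(':');
            url.push_str(password);
            log_url.push_str(":********");
        }
        url.push('@');
        log_url.push('@');
    }
    url.push_str(host);
    log_url.push_str(host);
    if let Some(port) = port {
        url.push(':');
        url.push_str(port);
        log_url.push(':');
        log_url.push_str(port);
    }
    if let Some(dbname) = dbname {
        url.push('/');
        url.push_str(dbname);
        log_url.push('/');
        log_url.push_str(dbname);
    }
    (url, log_url)
}

fn main() {
    let (url, log_url) =
        build_db_urls(Some("app"), Some("hunter2"), "db", Some("5432"), Some("libretunes"));
    assert_eq!(url, "postgres://app:hunter2@db:5432/libretunes");
    assert_eq!(log_url, "postgres://app:********@db:5432/libretunes");
}
```

Keeping the two strings in lockstep avoids ever formatting the real password into anything that reaches the log output.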


@@ -1,70 +0,0 @@
use cfg_if::cfg_if;
cfg_if! { if #[cfg(feature = "ssr")] {
use axum::{
body::Body,
extract::State,
response::IntoResponse,
http::{Request, Response, StatusCode, Uri},
};
use axum::response::Response as AxumResponse;
use tower::ServiceExt;
use tower_http::services::ServeDir;
use leptos::*;
use crate::app::App;
use std::str::FromStr;
pub async fn file_and_error_handler(uri: Uri, State(options): State<LeptosOptions>, req: Request<Body>) -> AxumResponse {
let root = options.site_root.clone();
let res = get_static_file(uri.clone(), &root).await.unwrap();
if res.status() == StatusCode::OK {
res.into_response()
} else {
let handler = leptos_axum::render_app_to_stream(options.to_owned(), App);
handler(req).await.into_response()
}
}
pub async fn get_static_file(uri: Uri, root: &str) -> Result<Response<Body>, (StatusCode, String)> {
let req = Request::builder().uri(uri.clone()).body(Body::empty()).unwrap();
// `ServeDir` implements `tower::Service` so we can call it with `tower::ServiceExt::oneshot`
// This path is relative to the cargo root
match ServeDir::new(root).oneshot(req).await.ok() {
Some(res) => Ok(res.into_response()),
None => Err((
StatusCode::INTERNAL_SERVER_ERROR,
format!("Something went wrong"),
)),
}
}
pub enum AssetType {
Audio,
Image,
}
pub async fn get_asset_file(filename: String, asset_type: AssetType) -> Result<Response<Body>, (StatusCode, String)> {
const DEFAULT_AUDIO_PATH: &str = "assets/audio";
const DEFAULT_IMAGE_PATH: &str = "assets/images";
let root = match asset_type {
AssetType::Audio => std::env::var("LIBRETUNES_AUDIO_PATH").unwrap_or(DEFAULT_AUDIO_PATH.to_string()),
AssetType::Image => std::env::var("LIBRETUNES_IMAGE_PATH").unwrap_or(DEFAULT_IMAGE_PATH.to_string()),
};
// Create a Uri from the filename
// ServeDir expects a leading `/`
let uri = Uri::from_str(format!("/{}", filename).as_str());
match uri {
Ok(uri) => get_static_file(uri, root.as_str()).await,
Err(_) => Err((
StatusCode::INTERNAL_SERVER_ERROR,
format!("Attempted to serve an invalid file"),
)),
}
}
}}

src/health.rs Normal file

@@ -0,0 +1,21 @@
use libretunes::api::health::health;
use server_fn::client::set_server_url;
#[tokio::main]
pub async fn main() {
let host = std::env::args()
.nth(1)
.unwrap_or("http://localhost:3000".to_string());
println!("Running health check against {host}");
set_server_url(Box::leak(host.into_boxed_str()));
match health().await {
Ok(result) => println!("Health check result: {result:?}"),
Err(err) => {
println!("Error: {err}");
std::process::exit(1);
}
}
}

src/ingest/create.rs Normal file

@@ -0,0 +1,143 @@
use crate::prelude::*;
use image_convert::ImageResource;
use std::fs;
use std::path::Path;
pub fn create_artist(
mut artist: backend::NewArtist,
image: &Option<ImageResource>,
db_conn: &mut PgPooledConn,
image_base_path: &Path,
) -> BackendResult<backend::Artist> {
let image_save_path = image
.as_ref()
.map(|image| save_image(image, image_base_path))
.transpose()
.context("Error saving artist image")?;
artist.image_path = image_save_path;
let new_artist = diesel::insert_into(artists::table)
.values(&artist)
.get_result(db_conn)
.context(format!(
"Error inserting new artist \"{}\" into database",
artist.name
))?;
Ok(new_artist)
}
pub fn create_album(
mut album: backend::NewAlbum,
artists: Vec<i32>,
image: &Option<ImageResource>,
db_conn: &mut PgPooledConn,
image_base_path: &Path,
) -> BackendResult<backend::Album> {
let image_save_path = image
.as_ref()
.map(|image| save_image(image, image_base_path))
.transpose()
.context("Error saving album image")?;
album.image_path = image_save_path;
let new_album = db_conn
.transaction(|db_conn| {
let new_album: backend::Album = diesel::insert_into(albums::table)
.values(&album)
.get_result(db_conn)
.context(format!(
"Error inserting new album \"{}\" into database",
album.title
))?;
for artist_id in artists {
diesel::insert_into(album_artists::table)
.values((
album_artists::album_id.eq(new_album.id),
album_artists::artist_id.eq(artist_id),
))
.execute(db_conn)
.context(format!("Error adding artist {artist_id} to album"))?;
}
Ok::<backend::Album, BackendError>(new_album)
})
.context(format!(
"Error running database transaction for album \"{}\"",
album.title
))?;
Ok(new_album)
}
pub fn create_song(
mut song: backend::NewSong,
artists: Vec<i32>,
image: Option<ImageResource>,
db_conn: &mut PgPooledConn,
image_base_path: &Path,
) -> BackendResult<backend::Song> {
let image_save_path = image
.as_ref()
.map(|image| save_image(image, image_base_path))
.transpose()
.context("Error saving song image")?;
song.image_path = image_save_path;
let new_song = db_conn
.transaction(|db_conn| {
let new_song: backend::Song = diesel::insert_into(songs::table)
.values(&song)
.get_result(db_conn)
.context(format!(
"Error inserting new song \"{}\" into database",
song.title
))?;
for artist_id in artists {
diesel::insert_into(song_artists::table)
.values((
song_artists::song_id.eq(new_song.id),
song_artists::artist_id.eq(artist_id),
))
.execute(db_conn)
.context(format!("Error adding artist {artist_id} to song"))?;
}
Ok::<backend::Song, BackendError>(new_song)
})
.context(format!(
"Error running database transaction for song \"{}\"",
song.title
))?;
Ok(new_song)
}
pub fn save_image(image: &ImageResource, image_base_path: &Path) -> BackendResult<LocalPath> {
let relative_path = AssetType::Image.new_path("webp");
let full_path = image_base_path.join(relative_path.clone().path());
let parent_path = full_path
.parent()
.ok_or(BackendError::InternalError(format!(
"Unable to get parent of path \"{}\"",
full_path.display()
)))?;
fs::create_dir_all(parent_path).context(format!(
"Failed to create parent directories for new user image at \"{}\"",
parent_path.display()
))?;
let mut image_target = ImageResource::from_path(&full_path);
image_convert::to_webp(&mut image_target, image, &image_convert::WEBPConfig::new())
.map_err(|e| InputError::InvalidInput(format!("{e}")))
.context("Error saving image as webp")?;
Ok(relative_path)
}

src/ingest/find.rs Normal file

@@ -0,0 +1,48 @@
use crate::prelude::*;
pub fn find_artist(
name: String,
db_conn: &mut PgPooledConn,
) -> BackendResult<Option<backend::Artist>> {
artists::table
.filter(artists::name.eq(&name))
.first(db_conn)
.optional()
.context(format!("Error finding artist \"{name}\""))
}
pub fn find_album(
title: String,
db_conn: &mut PgPooledConn,
) -> BackendResult<Option<backend::Album>> {
albums::table
.filter(albums::title.eq(&title))
.first(db_conn)
.optional()
.context(format!("Error finding album \"{title}\""))
}
pub fn find_song(
song: backend::NewSong,
db_conn: &mut PgPooledConn,
) -> BackendResult<Option<backend::Song>> {
songs::table
.filter(songs::title.eq(&song.title))
.filter(songs::album_id.eq(&song.album_id))
.first(db_conn)
.optional()
.context(format!("Error finding song \"{}\"", song.title))
}
pub fn find_song_from_file(path: LocalPath, db_conn: &mut PgPooledConn) -> BackendResult<bool> {
use diesel::dsl::{exists, select};
select(exists(
songs::table.filter(songs::storage_path.eq(path.clone())),
))
.get_result(db_conn)
.context(format!(
"Error checking if file \"{}\" exists in the song database",
path.path().display()
))
}
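The `.optional()` calls above are what let a missing row come back as `Ok(None)` instead of an error. The shape of that adapter can be sketched with a plain `Result` and a simplified error enum (`DbError` is illustrative; Diesel's real `OptionalExtension` works on `diesel::result::Error::NotFound`):

```rust
#[derive(Debug, PartialEq)]
enum DbError {
    NotFound,
    Other(String),
}

// Mirror of Diesel's OptionalExtension::optional(): a not-found result
// becomes Ok(None), while any other error passes through unchanged.
fn optional<T>(res: Result<T, DbError>) -> Result<Option<T>, DbError> {
    match res {
        Ok(v) => Ok(Some(v)),
        Err(DbError::NotFound) => Ok(None),
        Err(e) => Err(e),
    }
}

fn main() {
    assert_eq!(optional(Ok(42)), Ok(Some(42)));
    assert_eq!(optional::<i32>(Err(DbError::NotFound)), Ok(None));
    assert!(optional::<i32>(Err(DbError::Other("io".into()))).is_err());
}
```

This is why the `find_*` helpers can distinguish "artist does not exist yet" from a genuine database failure.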

src/ingest/mod.rs Normal file

@@ -0,0 +1,4 @@
pub mod create;
pub mod find;
pub mod scan;
pub mod task;

src/ingest/scan.rs Normal file

@@ -0,0 +1,327 @@
use crate::ingest::create::*;
use crate::ingest::find::*;
use crate::prelude::*;
use image_convert::ImageResource;
use std::path::{Path, PathBuf};
pub fn full_scan(state: &BackendState) {
info!("Ingest running...");
let ingest_path = state.config.audio_path.clone();
if let Err(e) = process_dir(ingest_path, state) {
error!("Error scanning for audio ingest: {e}");
}
}
pub fn process_dir(path: PathBuf, state: &BackendState) -> BackendResult<()> {
debug!("Scanning directory {} for audio files...", path.display());
let contents = path.read_dir().context("Error reading directory")?;
for dir_entry in contents {
let dir_entry = match dir_entry {
Ok(dir_entry) => dir_entry,
Err(e) => {
warn!(
"Error getting directory entry of {}: {e}, skipping...",
path.display()
);
continue;
}
};
let entry_type = match dir_entry.file_type() {
Ok(entry_type) => entry_type,
Err(e) => {
warn!(
"Error getting directory entry type of file {}: {e}, skipping...",
dir_entry.path().display()
);
continue;
}
};
if entry_type.is_dir() {
if let Err(e) = process_dir(dir_entry.path(), state) {
warn!(
"Failed to process directory {}: {e}, skipping...",
dir_entry.path().display()
);
}
} else if entry_type.is_symlink() {
if let Err(e) = process_link(dir_entry.path(), state) {
warn!(
"Failed to process symlink {}: {e}, skipping...",
dir_entry.path().display()
);
}
} else if entry_type.is_file() {
if let Err(e) = process_file(dir_entry.path(), state) {
warn!(
"Failed to process file {}: {e}, skipping...",
dir_entry.path().display()
);
}
} else {
unreachable!("One of is_file, is_dir, or is_symlink must be true")
}
}
Ok(())
}
pub fn process_link(path: PathBuf, state: &BackendState) -> BackendResult<()> {
debug!("Processing symlink {}...", path.display());
let destination = path.read_link().context("Failed to follow symlink")?;
if destination.is_file() {
process_file(destination.clone(), state).context(format!(
"Failed to process file {} pointed to by symlink {}",
destination.display(),
path.display()
))
} else if destination.is_dir() {
process_dir(destination.clone(), state).context(format!(
"Failed to process directory {} pointed to by symlink {}",
destination.display(),
path.display()
))
} else if destination.is_symlink() {
process_link(destination.clone(), state).context(format!(
"Failed to process symlink {} pointed to by symlink {}",
destination.display(),
path.display()
))
} else {
unreachable!("One of is_file, is_dir, or is_symlink must be true")
}
}
pub fn process_file(path: PathBuf, state: &BackendState) -> BackendResult<()> {
debug!("Processing file {}...", path.display());
let stripped_path = LocalPath::from_file_path(path.clone(), AssetType::Audio, state)
.context("Error stripping file path")?;
let mut db_conn = state.get_db_conn()?;
let song_ingested = find_song_from_file(stripped_path.clone(), &mut db_conn)
.context("Error checking if song file already ingested")?;
// Check if path exists
if song_ingested {
return Ok(());
}
// Read the file's audio tag metadata
let config = audiotags::config::Config {
sep_artist: "\0",
parse_multiple_artists: true,
};
let tag = audiotags::Tag::new()
.with_config(config)
.read_from_path(&path)
.context(format!(
"Error reading audio tags for file {}",
path.display()
))?;
let title = tag.title().ok_or(InputError::MissingField(format!(
"No title tag in file {}",
path.display()
)))?;
let artists = tag.artists().unwrap_or(vec![]);
let album_name = tag.album().map(|a| a.title);
let album_artists = tag.album_artists();
let track = tag.track_number().map(|track| track as i32);
let album_cover = tag.album_cover();
let release_date = tag
.date()
.map(|ts| {
NaiveDate::from_ymd_opt(
ts.year,
ts.month.unwrap_or(1) as u32,
ts.day.unwrap_or(1) as u32,
)
.ok_or(BackendError::InternalError(format!(
"Error creating NaiveDate from metadata timestamp {ts:?}"
)))
})
.transpose()?;
let duration = tag.duration().map(|d| Ok(d as u64)).unwrap_or_else(|| {
let file = std::fs::File::open(&path).context(format!(
"Error opening audio file {} to read duration",
path.display()
))?;
crate::util::audio::extract_metadata(file)
.map(|(_codec, duration)| duration)
.context(format!(
"Error extracting duration from audio file {}",
path.display()
))
})?;
let duration = i32::try_from(duration)
.map_err(|_| {
BackendError::InternalError(format!("u64 {duration} can't be represented as i32"))
})
.context("Error converting song duration")?;
let image_base_path = state.get_asset_path(&AssetType::Image);
db_conn
.transaction(|db_conn| {
let song_artist_ids = artists
.into_iter()
.map(|artist_name| {
find_or_create_artist(artist_name.to_string(), &None, db_conn, &image_base_path)
.map(|artist| artist.id)
})
.collect::<BackendResult<Vec<i32>>>()
.context(format!("Error with artists for new song {title}"))?;
let album_cover_resource = album_cover
.map(|album_cover| {
let cursor = std::io::Cursor::new(album_cover.data);
image_convert::ImageResource::from_reader(cursor)
})
.transpose()
.context("Error setting up reader for album cover")?;
let song_album = album_name
.map(|album_name| {
find_or_create_album(
album_name.to_string(),
&album_cover_resource,
album_artists.unwrap_or(vec![]),
db_conn,
&image_base_path,
)
.context(format!("Error finding/creating album for new song {title}"))
})
.transpose()
.context(format!("Error with album for new song {title}"))?;
let proto_song = backend::NewSong {
title: title.to_string(),
album_id: song_album.map(|album| album.id),
track,
duration,
release_date,
storage_path: stripped_path,
image_path: None,
};
let found_song =
find_song(proto_song.clone(), db_conn).context("Error finding song for ingest")?;
if let Some(song) = found_song {
info!(
"Found song {} instead of ingesting {}",
song.title,
path.display()
);
return Ok(());
}
create_song(
proto_song.clone(),
song_artist_ids,
None,
db_conn,
&image_base_path,
)
.context(format!("Error creating ingested song {}", proto_song.title))?;
Ok::<(), BackendError>(())
})
.context(format!(
"Error running database transaction to ingest song from file {}",
path.display()
))
}
fn find_or_create_artist(
artist_name: String,
artist_image: &Option<ImageResource>,
db_conn: &mut PgPooledConn,
image_base_path: &Path,
) -> BackendResult<backend::Artist> {
db_conn
.transaction(|db_conn| {
find_artist(artist_name.clone(), db_conn)
.context(format!("Error trying to find artist {artist_name}"))?
.map_or_else(
|| {
let new_artist = backend::NewArtist {
name: artist_name.to_string(),
image_path: None,
};
create_artist(new_artist, artist_image, db_conn, image_base_path)
.context(format!("Error creating artist {artist_name}"))
},
Result::Ok,
)
})
.context(format!(
"Error running database transaction to find or create artist {artist_name}"
))
}
fn find_or_create_album(
album_name: String,
album_image: &Option<ImageResource>,
artists: Vec<&str>,
db_conn: &mut PgPooledConn,
image_base_path: &Path,
) -> BackendResult<backend::Album> {
db_conn
.transaction(|db_conn| {
find_album(album_name.clone(), db_conn)
.context(format!("Error trying to find album {album_name}"))?
.map_or_else(
|| {
let new_album = backend::NewAlbum {
title: album_name.to_string(),
release_date: None,
image_path: None,
};
let artists: Vec<i32> = artists
.into_iter()
.map(|artist_name| {
find_or_create_artist(
artist_name.to_string(),
&None,
db_conn,
image_base_path,
)
.map(|artist| artist.id)
})
.collect::<BackendResult<Vec<i32>>>()
.context(format!(
"Error finding/creating artists for album {album_name}"
))?;
create_album(new_album, artists, album_image, db_conn, image_base_path)
.context("Error creating album")
},
Result::Ok,
)
})
.context(format!(
"Error running database transaction to find or create album {album_name}"
))
}
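The `find_or_create_*` helpers above share one pattern: look the record up, insert it only if missing, and return it either way. A std-only sketch, with a `HashMap` standing in for the artists table and `find_or_create` as an illustrative name:

```rust
use std::collections::HashMap;

// Look up a name; if absent, assign the next id and insert it. Either way,
// return the id, so callers never care whether the row pre-existed.
fn find_or_create(table: &mut HashMap<String, i32>, name: &str, next_id: &mut i32) -> i32 {
    if let Some(&id) = table.get(name) {
        return id;
    }
    let id = *next_id;
    *next_id += 1;
    table.insert(name.to_string(), id);
    id
}

fn main() {
    let mut table = HashMap::new();
    let mut next_id = 1;
    assert_eq!(find_or_create(&mut table, "Queen", &mut next_id), 1);
    // Second lookup finds the existing row instead of inserting again.
    assert_eq!(find_or_create(&mut table, "Queen", &mut next_id), 1);
    assert_eq!(find_or_create(&mut table, "Muse", &mut next_id), 2);
}
```

The real helpers additionally wrap find-then-insert in a database transaction, since two concurrent scans could otherwise both miss the lookup and insert duplicates.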

src/ingest/task.rs Normal file

@@ -0,0 +1,34 @@
use crate::ingest::scan::full_scan;
use crate::prelude::*;
use tokio::task::{JoinHandle, spawn, spawn_blocking};
use tokio::time::{Duration, Instant, interval_at};
pub const INITIAL_SCAN_DELAY: Duration = Duration::from_secs(10);
pub const SCAN_INTERVAL: Duration = Duration::from_hours(1);
/// Start the ingest task
/// Waits an initial delay for startup to complete, then runs a full ingest
/// scan on a regular interval
pub fn start_task(state: BackendState) -> JoinHandle<!> {
info!("Starting ingest task...");
let start_time = Instant::now() + INITIAL_SCAN_DELAY;
let mut scan_interval = interval_at(start_time, SCAN_INTERVAL);
spawn(async move {
loop {
scan_interval.tick().await;
let state = state.clone();
let scan_handle = spawn_blocking(move || {
full_scan(&state);
});
if let Err(e) = scan_handle.await {
error!("Ingest scan panicked: {e}");
}
}
})
}


@@ -1,30 +1,51 @@
#![warn(
unsafe_code,
clippy::cognitive_complexity,
clippy::dbg_macro,
clippy::debug_assert_with_mut_call,
clippy::doc_link_with_quotes,
clippy::doc_markdown,
clippy::empty_line_after_outer_attr,
clippy::float_cmp,
clippy::float_cmp_const,
clippy::float_equality_without_abs,
keyword_idents,
clippy::missing_const_for_fn,
non_ascii_idents,
noop_method_call,
clippy::print_stderr,
clippy::print_stdout,
clippy::semicolon_if_nothing_returned,
clippy::unseparated_literal_suffix,
clippy::suspicious_operation_groupings,
unused_import_braces,
clippy::unused_self,
clippy::use_debug,
clippy::useless_let_if_seq,
clippy::wildcard_dependencies
)]
#![allow(clippy::unused_unit, clippy::unit_arg, clippy::type_complexity)]
#![recursion_limit = "256"]
#![feature(duration_constructors)]
#![feature(never_type)]
#![feature(adt_const_params)]
#![feature(iter_intersperse)]
pub mod api;
pub mod app;
pub mod auth;
pub mod songdata;
pub mod albumdata;
pub mod artistdata;
pub mod playstatus;
pub mod playbar;
pub mod database;
pub mod queue;
pub mod song;
pub mod components;
pub mod models;
pub mod pages;
pub mod components;
pub mod users;
pub mod search;
pub mod fileserv;
pub mod error_template;
pub mod api;
pub mod upload;
pub mod util;
pub use util::prelude;
use cfg_if::cfg_if;
cfg_if! {
if #[cfg(feature = "ssr")] {
pub mod auth_backend;
pub mod schema;
pub mod ingest;
}
}
@@ -39,7 +60,7 @@ if #[cfg(feature = "hydrate")] {
console_error_panic_hook::set_once();
leptos::mount_to_body(App);
leptos::mount::hydrate_body(App);
}
}
}


@@ -1,12 +1,4 @@
// Needed for building in Docker container
// See https://github.com/clux/muslrust?tab=readme-ov-file#diesel-and-pq-builds
// See https://github.com/sgrif/pq-sys/issues/25
#[cfg(target_env = "musl")]
extern crate openssl;
#[cfg(target_env = "musl")]
#[macro_use]
extern crate diesel;
#![recursion_limit = "256"]
#[cfg(feature = "ssr")]
extern crate diesel_migrations;
@@ -14,21 +6,39 @@ extern crate diesel_migrations;
#[cfg(feature = "ssr")]
#[tokio::main]
async fn main() {
use axum::{routing::get, Router, extract::Path, middleware::from_fn};
use leptos::*;
use leptos_axum::{generate_route_list, LeptosRoutes};
use libretunes::app::*;
use libretunes::util::require_auth::require_auth_middleware;
use libretunes::fileserv::{file_and_error_handler, get_asset_file, get_static_file, AssetType};
use axum_login::tower_sessions::SessionManagerLayer;
use tower_sessions_redis_store::{fred::prelude::*, RedisStore};
use axum::{Router, middleware::from_fn};
use axum::{body::Body, extract::Request, http::Response, middleware::Next};
use axum_login::AuthManagerLayerBuilder;
use libretunes::auth_backend::AuthBackend;
use log::*;
use axum_login::tower_sessions::SessionManagerLayer;
use http::StatusCode;
use leptos_axum::{LeptosRoutes, file_and_error_handler, generate_route_list};
use libretunes::app::*;
use libretunes::prelude::*;
use libretunes::util::config::load_config;
use libretunes::util::require_auth::require_auth_middleware;
use tower_http::{compression::CompressionLayer, services::ServeDir};
use tower_sessions_redis_store::RedisStore;
flexi_logger::Logger::try_with_env_or_str("debug").unwrap().format(flexi_logger::opt_format).start().unwrap();
flexi_logger::Logger::try_with_env_or_str("debug")
.unwrap()
.format(flexi_logger::opt_format)
.start()
.unwrap();
info!("\n{}", include_str!("../ascii_art.txt"));
let config = load_config().unwrap_or_else(|err| {
error!("Failed to load configuration: {}", err);
std::process::exit(1);
});
let state = BackendState::from_config(config)
.await
.unwrap_or_else(|err| {
error!("Failed to initialize backend state: {}", err);
std::process::exit(1);
});
info!("Starting Leptos server...");
use dotenvy::dotenv;
@@ -36,44 +46,75 @@ async fn main() {
debug!("Running database migrations...");
let mut db_conn = state.get_db_conn().unwrap_or_else(|err| {
error!("Failed to get database connection: {}", err);
std::process::exit(1);
});
// Bring the database up to date
libretunes::database::migrate();
libretunes::util::database::migrate(&mut db_conn);
drop(db_conn); // Close the connection after migrations
debug!("Connecting to Redis...");
// Create a task to periodically run ingest
if !state.config.disable_ingest {
let _ingest_task = libretunes::ingest::task::start_task(state.clone());
}
let redis_url = std::env::var("REDIS_URL").expect("REDIS_URL must be set");
let redis_config = RedisConfig::from_url(&redis_url).expect(&format!("Unable to parse Redis URL: {}", redis_url));
let redis_pool = RedisPool::new(redis_config, None, None, None, 1).expect("Unable to create Redis pool");
redis_pool.connect();
redis_pool.wait_for_connect().await.expect("Unable to connect to Redis");
let session_store = RedisStore::new(redis_pool);
debug!("Setting up session store...");
let session_store = RedisStore::new(state.get_redis_conn());
let session_layer = SessionManagerLayer::new(session_store);
let auth_backend = AuthBackend;
let auth_backend = AuthBackend {
backend_state: state.clone(),
};
let auth_layer = AuthManagerLayerBuilder::new(auth_backend, session_layer).build();
let conf = get_configuration(None).await.unwrap();
let audio_path = state.config.audio_path.clone();
let image_path = state.config.image_path.clone();
// A middleware that injects the backend state into the request extensions,
// allowing it to be extracted later in the request lifecycle
let backend_state_middleware = move |mut req: Request, next: Next| {
let state = state.clone();
async move {
req.extensions_mut().insert(state);
let response = next.run(req).await;
Ok::<Response<Body>, (StatusCode, &'static str)>(response)
}
};
let conf = get_configuration(None).unwrap();
let leptos_options = conf.leptos_options;
let addr = leptos_options.site_addr;
// Generate the list of routes in your Leptos App
let routes = generate_route_list(App);
let app = Router::new()
.leptos_routes(&leptos_options, routes, App)
.route("/assets/audio/:song", get(|Path(song) : Path<String>| get_asset_file(song, AssetType::Audio)))
.route("/assets/images/:image", get(|Path(image) : Path<String>| get_asset_file(image, AssetType::Image)))
.route("/assets/*uri", get(|uri| get_static_file(uri, "")))
.leptos_routes(&leptos_options, routes, {
let leptos_options = leptos_options.clone();
move || shell(leptos_options.clone())
})
.nest_service(AUDIO_WEB_PATH, ServeDir::new(audio_path))
.nest_service(IMAGE_WEB_PATH, ServeDir::new(image_path))
.layer(from_fn(require_auth_middleware))
.layer(auth_layer)
.fallback(file_and_error_handler)
.layer(from_fn(backend_state_middleware))
.fallback(file_and_error_handler(shell))
.layer(CompressionLayer::new())
.with_state(leptos_options);
let listener = tokio::net::TcpListener::bind(&addr).await.expect(&format!("Could not bind to {}", &addr));
let listener = tokio::net::TcpListener::bind(&addr)
.await
.unwrap_or_else(|_| panic!("Could not bind to {}", &addr));
info!("Listening on http://{}", &addr);
axum::serve(listener, app.into_make_service()).await.expect("Server failed");
axum::serve(listener, app.into_make_service())
.await
.expect("Server failed");
}
#[cfg(not(feature = "ssr"))]


@@ -1,787 +0,0 @@
use chrono::{NaiveDate, NaiveDateTime};
use serde::{Deserialize, Serialize};
use cfg_if::cfg_if;
cfg_if! {
if #[cfg(feature = "ssr")] {
use diesel::prelude::*;
use crate::database::*;
use std::error::Error;
use crate::songdata::SongData;
use crate::albumdata::AlbumData;
}
}
// These "models" are used to represent the data in the database
// Diesel uses these models to generate the SQL queries that are used to interact with the database.
// These types are also used for API endpoints, for consistency. Because the file must be compiled
// for both the server and the client, we use the `cfg_attr` attribute to conditionally add
// diesel-specific attributes to the models when compiling for the server
/// Model for a "User", used for querying the database
/// Various fields are wrapped in Options, because they are not always wanted for inserts/retrieval
/// Using deserialize_as makes Diesel use the specified type when deserializing from the database,
/// and then call .into() to convert it into the Option
#[cfg_attr(feature = "ssr", derive(Queryable, Selectable, Insertable))]
#[cfg_attr(feature = "ssr", diesel(table_name = crate::schema::users))]
#[cfg_attr(feature = "ssr", diesel(check_for_backend(diesel::pg::Pg)))]
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct User {
/// A unique id for the user
#[cfg_attr(feature = "ssr", diesel(deserialize_as = i32))]
// #[cfg_attr(feature = "ssr", diesel(skip_insertion))] // This feature is not yet released
pub id: Option<i32>,
/// The user's username
pub username: String,
/// The user's email
pub email: String,
/// The user's password, stored as a hash
#[cfg_attr(feature = "ssr", diesel(deserialize_as = String))]
pub password: Option<String>,
/// The time the user was created
#[cfg_attr(feature = "ssr", diesel(deserialize_as = NaiveDateTime))]
pub created_at: Option<NaiveDateTime>,
/// Whether the user is an admin
pub admin: bool,
}
impl User {
/// Get the history of songs listened to by this user from the database
///
/// The returned history will be ordered by date in descending order,
/// and a limit of N will select the N most recent entries.
/// The `id` field of this user must be present (Some) to get history
///
/// # Arguments
///
/// * `limit` - An optional limit on the number of history entries to return
/// * `conn` - A mutable reference to a database connection
///
/// # Returns
///
/// * `Result<Vec<HistoryEntry>, Box<dyn Error>>` -
/// A result indicating success with a vector of history entries, or an error
///
#[cfg(feature = "ssr")]
pub fn get_history(self: &Self, limit: Option<i64>, conn: &mut PgPooledConn) ->
Result<Vec<HistoryEntry>, Box<dyn Error>> {
use crate::schema::song_history::dsl::*;
let my_id = self.id.ok_or("Artist id must be present (Some) to get history")?;
let my_history =
if let Some(limit) = limit {
song_history
.filter(user_id.eq(my_id))
.order(date.desc())
.limit(limit)
.load(conn)?
} else {
song_history
.filter(user_id.eq(my_id))
.load(conn)?
};
Ok(my_history)
}
/// Get the history of songs listened to by this user from the database
///
/// The returned history will be ordered by date in descending order,
/// and a limit of N will select the N most recent entries.
/// The `id` field of this user must be present (Some) to get history
///
/// # Arguments
///
/// * `limit` - An optional limit on the number of history entries to return
/// * `conn` - A mutable reference to a database connection
///
/// # Returns
///
/// * `Result<Vec<(SystemTime, Song)>, Box<dyn Error>>` -
/// A result indicating success with a vector of listen dates and songs, or an error
///
#[cfg(feature = "ssr")]
pub fn get_history_songs(self: &Self, limit: Option<i64>, conn: &mut PgPooledConn) ->
Result<Vec<(NaiveDateTime, Song)>, Box<dyn Error>> {
use crate::schema::songs::dsl::*;
use crate::schema::song_history::dsl::*;
let my_id = self.id.ok_or("Artist id must be present (Some) to get history")?;
let my_history =
if let Some(limit) = limit {
song_history
.inner_join(songs)
.filter(user_id.eq(my_id))
.order(date.desc())
.limit(limit)
.select((date, songs::all_columns()))
.load(conn)?
} else {
song_history
.inner_join(songs)
.filter(user_id.eq(my_id))
.order(date.desc())
.select((date, songs::all_columns()))
.load(conn)?
};
Ok(my_history)
}
/// Add a song to this user's history in the database
///
/// The date of the history entry will be the current time
/// The `id` field of this user must be present (Some) to add history
///
/// # Arguments
///
/// * `song_id` - The id of the song to add to this user's history
/// * `conn` - A mutable reference to a database connection
///
/// # Returns
///
/// * `Result<(), Box<dyn Error>>` - A result indicating success with an empty value, or an error
///
#[cfg(feature = "ssr")]
pub fn add_history(self: &Self, song_id: i32, conn: &mut PgPooledConn) -> Result<(), Box<dyn Error>> {
use crate::schema::song_history;
let my_id = self.id.ok_or("User id must be present (Some) to add history")?;
diesel::insert_into(song_history::table)
.values((song_history::user_id.eq(my_id), song_history::song_id.eq(song_id)))
.execute(conn)?;
Ok(())
}
/// Check if this user has listened to a song
///
/// The `id` field of this user must be present (Some) to check history
///
/// # Arguments
///
/// * `song_id` - The id of the song to check if this user has listened to
/// * `conn` - A mutable reference to a database connection
///
/// # Returns
///
/// * `Result<bool, Box<dyn Error>>` - A result indicating success with a boolean value, or an error
///
#[cfg(feature = "ssr")]
pub fn has_listened_to(self: &Self, song_id: i32, conn: &mut PgPooledConn) -> Result<bool, Box<dyn Error>> {
use crate::schema::song_history::{self, user_id};
let my_id = self.id.ok_or("User id must be present (Some) to check history")?;
let has_listened = song_history::table
.filter(user_id.eq(my_id))
.filter(song_history::song_id.eq(song_id))
.first::<HistoryEntry>(conn)
.optional()?
.is_some();
Ok(has_listened)
}
/// Like or unlike a song for this user
/// If liking a song, remove the dislike if it exists
#[cfg(feature = "ssr")]
pub async fn set_like_song(self: &Self, song_id: i32, like: bool, conn: &mut PgPooledConn) ->
Result<(), Box<dyn Error>> {
use log::*;
debug!("Setting like for song {} to {}", song_id, like);
use crate::schema::song_likes;
use crate::schema::song_dislikes;
let my_id = self.id.ok_or("User id must be present (Some) to like/un-like a song")?;
if like {
diesel::insert_into(song_likes::table)
.values((song_likes::song_id.eq(song_id), song_likes::user_id.eq(my_id)))
.execute(conn)?;
// Remove dislike if it exists
diesel::delete(song_dislikes::table.filter(song_dislikes::song_id.eq(song_id)
.and(song_dislikes::user_id.eq(my_id))))
.execute(conn)?;
} else {
diesel::delete(song_likes::table.filter(song_likes::song_id.eq(song_id).and(song_likes::user_id.eq(my_id))))
.execute(conn)?;
}
Ok(())
}
/// Get the like status of a song for this user
#[cfg(feature = "ssr")]
pub async fn get_like_song(self: &Self, song_id: i32, conn: &mut PgPooledConn) -> Result<bool, Box<dyn Error>> {
use crate::schema::song_likes;
let my_id = self.id.ok_or("User id must be present (Some) to get like status of a song")?;
let like = song_likes::table
.filter(song_likes::song_id.eq(song_id).and(song_likes::user_id.eq(my_id)))
.first::<(i32, i32)>(conn)
.optional()?
.is_some();
Ok(like)
}
/// Get songs liked by this user
#[cfg(feature = "ssr")]
pub async fn get_liked_songs(self: &Self, conn: &mut PgPooledConn) -> Result<Vec<Song>, Box<dyn Error>> {
use crate::schema::songs::dsl::*;
use crate::schema::song_likes::dsl::*;
let my_id = self.id.ok_or("User id must be present (Some) to get liked songs")?;
let my_songs = songs
.inner_join(song_likes)
.filter(user_id.eq(my_id))
.select(songs::all_columns())
.load(conn)?;
Ok(my_songs)
}
/// Dislike or remove dislike from a song for this user
/// If disliking a song, remove like if it exists
#[cfg(feature = "ssr")]
pub async fn set_dislike_song(self: &Self, song_id: i32, dislike: bool, conn: &mut PgPooledConn) ->
Result<(), Box<dyn Error>> {
use log::*;
debug!("Setting dislike for song {} to {}", song_id, dislike);
use crate::schema::song_likes;
use crate::schema::song_dislikes;
let my_id = self.id.ok_or("User id must be present (Some) to dislike/un-dislike a song")?;
if dislike {
diesel::insert_into(song_dislikes::table)
.values((song_dislikes::song_id.eq(song_id), song_dislikes::user_id.eq(my_id)))
.execute(conn)?;
// Remove like if it exists
diesel::delete(song_likes::table.filter(song_likes::song_id.eq(song_id)
.and(song_likes::user_id.eq(my_id))))
.execute(conn)?;
} else {
diesel::delete(song_dislikes::table.filter(song_dislikes::song_id.eq(song_id)
.and(song_dislikes::user_id.eq(my_id))))
.execute(conn)?;
}
Ok(())
}
/// Get the dislike status of a song for this user
#[cfg(feature = "ssr")]
pub async fn get_dislike_song(self: &Self, song_id: i32, conn: &mut PgPooledConn) -> Result<bool, Box<dyn Error>> {
use crate::schema::song_dislikes;
let my_id = self.id.ok_or("User id must be present (Some) to get dislike status of a song")?;
let dislike = song_dislikes::table
.filter(song_dislikes::song_id.eq(song_id).and(song_dislikes::user_id.eq(my_id)))
.first::<(i32, i32)>(conn)
.optional()?
.is_some();
Ok(dislike)
}
/// Get songs disliked by this user
#[cfg(feature = "ssr")]
pub async fn get_disliked_songs(self: &Self, conn: &mut PgPooledConn) -> Result<Vec<Song>, Box<dyn Error>> {
use crate::schema::songs::dsl::*;
use crate::schema::song_dislikes::dsl::*;
let my_id = self.id.ok_or("User id must be present (Some) to get disliked songs")?;
let my_songs = songs
.inner_join(song_dislikes)
.filter(user_id.eq(my_id))
.select(songs::all_columns())
.load(conn)?;
Ok(my_songs)
}
}
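The like/dislike setters above keep `song_likes` and `song_dislikes` mutually exclusive: setting one preference also deletes the opposite row. A toy in-memory sketch of that invariant (plain `HashSet`s stand in for the Diesel-backed tables; `Prefs`, `set_like`, and `set_dislike` are illustrative names, not part of the real API):

```rust
use std::collections::HashSet;

// Toy model of the invariant maintained by set_like_song/set_dislike_song:
// marking one preference removes the other, so a song is never both liked
// and disliked by the same user.
#[derive(Default)]
struct Prefs {
    likes: HashSet<i32>,
    dislikes: HashSet<i32>,
}

impl Prefs {
    fn set_like(&mut self, song_id: i32, like: bool) {
        if like {
            self.likes.insert(song_id);
            self.dislikes.remove(&song_id); // mirrors the delete on song_dislikes
        } else {
            self.likes.remove(&song_id);
        }
    }

    fn set_dislike(&mut self, song_id: i32, dislike: bool) {
        if dislike {
            self.dislikes.insert(song_id);
            self.likes.remove(&song_id); // mirrors the delete on song_likes
        } else {
            self.dislikes.remove(&song_id);
        }
    }
}

fn main() {
    let mut prefs = Prefs::default();
    prefs.set_like(7, true);
    prefs.set_dislike(7, true); // flips the preference
    assert!(!prefs.likes.contains(&7));
    assert!(prefs.dislikes.contains(&7));
}
```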
/// Model for an artist
#[cfg_attr(feature = "ssr", derive(Queryable, Selectable, Insertable, Identifiable))]
#[cfg_attr(feature = "ssr", diesel(table_name = crate::schema::artists))]
#[cfg_attr(feature = "ssr", diesel(check_for_backend(diesel::pg::Pg)))]
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct Artist {
/// A unique id for the artist
#[cfg_attr(feature = "ssr", diesel(deserialize_as = i32))]
pub id: Option<i32>,
/// The artist's name
pub name: String,
}
impl Artist {
/// Add an album to this artist in the database
///
/// # Arguments
///
/// * `new_album_id` - The id of the album to add to this artist
/// * `conn` - A mutable reference to a database connection
///
/// # Returns
///
/// * `Result<(), Box<dyn Error>>` - A result indicating success with an empty value, or an error
///
#[cfg(feature = "ssr")]
pub fn add_album(self: &Self, new_album_id: i32, conn: &mut PgPooledConn) -> Result<(), Box<dyn Error>> {
use crate::schema::album_artists::dsl::*;
let my_id = self.id.ok_or("Artist id must be present (Some) to add an album")?;
diesel::insert_into(album_artists)
.values((album_id.eq(new_album_id), artist_id.eq(my_id)))
.execute(conn)?;
Ok(())
}
/// Get albums by artist from the database
///
/// The `id` field of this artist must be present (Some) to get albums
///
/// # Arguments
///
/// * `conn` - A mutable reference to a database connection
///
/// # Returns
///
/// * `Result<Vec<Album>, Box<dyn Error>>` - A result indicating success with a vector of albums, or an error
///
#[cfg(feature = "ssr")]
pub fn get_albums(self: &Self, conn: &mut PgPooledConn) -> Result<Vec<Album>, Box<dyn Error>> {
use crate::schema::albums::dsl::*;
use crate::schema::album_artists::dsl::*;
let my_id = self.id.ok_or("Artist id must be present (Some) to get albums")?;
let my_albums = albums
.inner_join(album_artists)
.filter(artist_id.eq(my_id))
.select(albums::all_columns())
.load(conn)?;
Ok(my_albums)
}
/// Add a song to this artist in the database
///
/// The `id` field of this artist must be present (Some) to add a song
///
/// # Arguments
///
/// * `new_song_id` - The id of the song to add to this artist
/// * `conn` - A mutable reference to a database connection
///
/// # Returns
///
/// * `Result<(), Box<dyn Error>>` - A result indicating success with an empty value, or an error
///
#[cfg(feature = "ssr")]
pub fn add_song(self: &Self, new_song_id: i32, conn: &mut PgPooledConn) -> Result<(), Box<dyn Error>> {
use crate::schema::song_artists::dsl::*;
let my_id = self.id.ok_or("Artist id must be present (Some) to add a song")?;
diesel::insert_into(song_artists)
.values((song_id.eq(new_song_id), artist_id.eq(my_id)))
.execute(conn)?;
Ok(())
}
/// Get songs by this artist from the database
///
/// The `id` field of this artist must be present (Some) to get songs
///
/// # Arguments
///
/// * `conn` - A mutable reference to a database connection
///
/// # Returns
///
/// * `Result<Vec<Song>, Box<dyn Error>>` - A result indicating success with a vector of songs, or an error
///
#[cfg(feature = "ssr")]
pub fn get_songs(self: &Self, conn: &mut PgPooledConn) -> Result<Vec<Song>, Box<dyn Error>> {
use crate::schema::songs::dsl::*;
use crate::schema::song_artists::dsl::*;
let my_id = self.id.ok_or("Artist id must be present (Some) to get songs")?;
let my_songs = songs
.inner_join(song_artists)
.filter(artist_id.eq(my_id))
.select(songs::all_columns())
.load(conn)?;
Ok(my_songs)
}
/// Display a list of artists as a string.
///
/// For one artist, displays [artist1]. For two artists, displays [artist1] & [artist2].
/// For three or more artists, displays [artist1], [artist2] & [artist3].
pub fn display_list(artists: &[Artist]) -> String {
let mut artist_list = String::new();
for (i, artist) in artists.iter().enumerate() {
if i == 0 {
artist_list.push_str(&artist.name);
} else if i == artists.len() - 1 {
artist_list.push_str(&format!(" & {}", artist.name));
} else {
artist_list.push_str(&format!(", {}", artist.name));
}
}
artist_list
}
}
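The joining behavior of `display_list` can be checked with plain strings. A standalone re-implementation of the same loop (`display_names` is a hypothetical helper, using `&str` instead of `Artist` so it runs without the model types):

```rust
// Standalone copy of display_list's joining logic: first name plain,
// last name prefixed with " & ", middle names prefixed with ", ".
fn display_names(names: &[&str]) -> String {
    let mut out = String::new();
    for (i, name) in names.iter().enumerate() {
        if i == 0 {
            out.push_str(name);
        } else if i == names.len() - 1 {
            out.push_str(&format!(" & {}", name));
        } else {
            out.push_str(&format!(", {}", name));
        }
    }
    out
}

fn main() {
    assert_eq!(display_names(&["Solo"]), "Solo");
    assert_eq!(display_names(&["A", "B"]), "A & B");
    assert_eq!(display_names(&["A", "B", "C"]), "A, B & C");
}
```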
/// Model for an album
#[cfg_attr(feature = "ssr", derive(Queryable, Selectable, Insertable, Identifiable))]
#[cfg_attr(feature = "ssr", diesel(table_name = crate::schema::albums))]
#[cfg_attr(feature = "ssr", diesel(check_for_backend(diesel::pg::Pg)))]
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct Album {
/// A unique id for the album
#[cfg_attr(feature = "ssr", diesel(deserialize_as = i32))]
pub id: Option<i32>,
/// The album's title
pub title: String,
/// The album's release date
pub release_date: Option<NaiveDate>,
/// The path to the album's image file
pub image_path: Option<String>,
}
impl Album {
/// Add an artist to this album in the database
///
/// The `id` field of this album must be present (Some) to add an artist
///
/// # Arguments
///
/// * `new_artist_id` - The id of the artist to add to this album
/// * `conn` - A mutable reference to a database connection
///
/// # Returns
///
/// * `Result<(), Box<dyn Error>>` - A result indicating success with an empty value, or an error
///
#[cfg(feature = "ssr")]
pub fn add_artist(self: &Self, new_artist_id: i32, conn: &mut PgPooledConn) -> Result<(), Box<dyn Error>> {
use crate::schema::album_artists::dsl::*;
let my_id = self.id.ok_or("Album id must be present (Some) to add an artist")?;
diesel::insert_into(album_artists)
.values((album_id.eq(my_id), artist_id.eq(new_artist_id)))
.execute(conn)?;
Ok(())
}
/// Get songs by this album from the database
///
/// The `id` field of this album must be present (Some) to get songs
///
/// # Arguments
///
/// * `conn` - A mutable reference to a database connection
///
/// # Returns
///
/// * `Result<Vec<Song>, Box<dyn Error>>` - A result indicating success with a vector of songs, or an error
///
#[cfg(feature = "ssr")]
pub fn get_songs(self: &Self, conn: &mut PgPooledConn) -> Result<Vec<Song>, Box<dyn Error>> {
use crate::schema::songs::dsl::*;
let my_id = self.id.ok_or("Album id must be present (Some) to get songs")?;
// Filter songs directly on their album_id; joining song_artists here would
// duplicate each song once per artist
let my_songs = songs
.filter(album_id.eq(my_id))
.load(conn)?;
Ok(my_songs)
}
/// Obtain an album's display data from its album id
///
/// # Arguments
///
/// * `album_id` - The id of the album to select
/// * `conn` - A mutable reference to a database connection
///
/// # Returns
///
/// * `Result<AlbumData, Box<dyn Error>>` - A result indicating success with the desired album data, or an error
///
#[cfg(feature = "ssr")]
pub fn get_album_data(album_id: i32, conn: &mut PgPooledConn) -> Result<AlbumData, Box<dyn Error>> {
use crate::schema::*;
let artist_list: Vec<Artist> = album_artists::table
.filter(album_artists::album_id.eq(album_id))
.inner_join(artists::table.on(album_artists::artist_id.eq(artists::id)))
.select(
artists::all_columns
)
.load(conn)?;
// Get info of album
let albuminfo = albums::table
.filter(albums::id.eq(album_id))
.first::<Album>(conn)?;
let img = albuminfo.image_path.unwrap_or("/assets/images/placeholders/MusicPlaceholder.svg".to_string());
let albumdata = AlbumData {
id: albuminfo.id.unwrap(),
title: albuminfo.title,
artists: artist_list,
release_date: albuminfo.release_date,
image_path: img
};
Ok(albumdata)
}
/// Obtain the songs on an album, as display data, from the album's id
///
/// # Arguments
///
/// * `album_id` - The id of the album whose songs to select
/// * `user_like_dislike` - An optional user whose like/dislike status will be included
/// * `conn` - A mutable reference to a database connection
///
/// # Returns
///
/// * `Result<Vec<SongData>, Box<dyn Error>>` - A result indicating success with a vector of song data, or an error
///
#[cfg(feature = "ssr")]
pub fn get_song_data(album_id: i32, user_like_dislike: Option<User>, conn: &mut PgPooledConn) -> Result<Vec<SongData>, Box<dyn Error>> {
use crate::schema::*;
use std::collections::HashMap;
let song_list = if let Some(user_like_dislike) = user_like_dislike {
let user_like_dislike_id = user_like_dislike.id.unwrap();
let song_list: Vec<(Album, Option<Song>, Option<Artist>, Option<(i32, i32)>, Option<(i32, i32)>)> =
albums::table
.find(album_id)
.left_join(songs::table.on(albums::id.nullable().eq(songs::album_id)))
.left_join(song_artists::table.inner_join(artists::table).on(songs::id.eq(song_artists::song_id)))
.left_join(song_likes::table.on(songs::id.eq(song_likes::song_id).and(song_likes::user_id.eq(user_like_dislike_id))))
.left_join(song_dislikes::table.on(songs::id.eq(song_dislikes::song_id).and(song_dislikes::user_id.eq(user_like_dislike_id))))
.select((
albums::all_columns,
songs::all_columns.nullable(),
artists::all_columns.nullable(),
song_likes::all_columns.nullable(),
song_dislikes::all_columns.nullable()
))
.order(songs::track.asc())
.load(conn)?;
song_list
} else {
let song_list: Vec<(Album, Option<Song>, Option<Artist>)> =
albums::table
.find(album_id)
.left_join(songs::table.on(albums::id.nullable().eq(songs::album_id)))
.left_join(song_artists::table.inner_join(artists::table).on(songs::id.eq(song_artists::song_id)))
.select((
albums::all_columns,
songs::all_columns.nullable(),
artists::all_columns.nullable()
))
.order(songs::track.asc())
.load(conn)?;
let song_list: Vec<(Album, Option<Song>, Option<Artist>, Option<(i32, i32)>, Option<(i32, i32)>)> =
song_list.into_iter().map( |(album, song, artist)| (album, song, artist, None, None) ).collect();
song_list
};
let mut album_songs: HashMap<i32, SongData> = HashMap::with_capacity(song_list.len());
for (album, song, artist, like, dislike) in song_list {
if let Some(song) = song {
if let Some(stored_songdata) = album_songs.get_mut(&song.id.unwrap()) {
// If the song is already in the map, update the artists
if let Some(artist) = artist {
stored_songdata.artists.push(artist);
}
} else {
let like_dislike = match (like, dislike) {
(Some(_), Some(_)) => Some((true, true)),
(Some(_), None) => Some((true, false)),
(None, Some(_)) => Some((false, true)),
_ => None,
};
let image_path = song.image_path.unwrap_or(
album.image_path.clone().unwrap_or("/assets/images/placeholders/MusicPlaceholder.svg".to_string()));
let songdata = SongData {
id: song.id.unwrap(),
title: song.title,
artists: artist.map(|artist| vec![artist]).unwrap_or_default(),
album: Some(album),
track: song.track,
duration: song.duration,
release_date: song.release_date,
song_path: song.storage_path,
image_path: image_path,
like_dislike: like_dislike,
};
album_songs.insert(song.id.unwrap(), songdata);
}
}
}
// Sort the songs by track number
let mut songdata: Vec<SongData> = album_songs.into_values().collect();
songdata.sort_by(|a, b| a.track.cmp(&b.track));
Ok(songdata)
}
}
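The grouping step in `get_song_data` has to cope with the join yielding one row per (song, artist) pair: rows that share a song id are folded into a single entry whose artist list accumulates. A minimal sketch of that fold, with plain tuples standing in for the Diesel rows (`group_rows` is a hypothetical helper, not part of the real API):

```rust
use std::collections::HashMap;

// Fold (song_id, title, artist) rows into one entry per song, accumulating
// the artists, mirroring the HashMap-based loop in get_song_data.
fn group_rows(rows: Vec<(i32, &str, &str)>) -> HashMap<i32, (String, Vec<String>)> {
    let mut grouped: HashMap<i32, (String, Vec<String>)> = HashMap::new();
    for (song_id, title, artist) in rows {
        grouped
            .entry(song_id)
            .or_insert_with(|| (title.to_string(), Vec::new()))
            .1
            .push(artist.to_string());
    }
    grouped
}

fn main() {
    let rows = vec![(1, "Duet", "A"), (1, "Duet", "B"), (2, "Solo", "C")];
    let grouped = group_rows(rows);
    assert_eq!(grouped[&1].1, vec!["A", "B"]);
    assert_eq!(grouped[&2].1, vec!["C"]);
}
```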
/// Model for a song
#[cfg_attr(feature = "ssr", derive(Queryable, Selectable, Insertable))]
#[cfg_attr(feature = "ssr", diesel(table_name = crate::schema::songs))]
#[cfg_attr(feature = "ssr", diesel(check_for_backend(diesel::pg::Pg)))]
#[derive(Serialize, Deserialize)]
pub struct Song {
/// A unique id for the song
#[cfg_attr(feature = "ssr", diesel(deserialize_as = i32))]
pub id: Option<i32>,
/// The song's title
pub title: String,
/// The album the song is from
pub album_id: Option<i32>,
/// The track number of the song on the album
pub track: Option<i32>,
/// The duration of the song in seconds
pub duration: i32,
/// The song's release date
pub release_date: Option<NaiveDate>,
/// The path to the song's audio file
pub storage_path: String,
/// The path to the song's image file
pub image_path: Option<String>,
}
impl Song {
/// Get the artists for this song from the database
///
/// The `id` field of this song must be present (Some) to get artists
///
/// # Arguments
///
/// * `conn` - A mutable reference to a database connection
///
/// # Returns
///
/// * `Result<Vec<Artist>, Box<dyn Error>>` - A result indicating success with a vector of artists, or an error
///
#[cfg(feature = "ssr")]
pub fn get_artists(self: &Self, conn: &mut PgPooledConn) -> Result<Vec<Artist>, Box<dyn Error>> {
use crate::schema::artists::dsl::*;
use crate::schema::song_artists::dsl::*;
let my_id = self.id.ok_or("Song id must be present (Some) to get artists")?;
let my_artists = artists
.inner_join(song_artists)
.filter(song_id.eq(my_id))
.select(artists::all_columns())
.load(conn)?;
Ok(my_artists)
}
/// Get the album for this song from the database
///
/// # Arguments
///
/// * `conn` - A mutable reference to a database connection
///
/// # Returns
///
/// * `Result<Option<Album>, Box<dyn Error>>` - A result indicating success with an album, or None if
/// the song does not have an album, or an error
///
#[cfg(feature = "ssr")]
pub fn get_album(self: &Self, conn: &mut PgPooledConn) -> Result<Option<Album>, Box<dyn Error>> {
use crate::schema::albums::dsl::*;
if let Some(album_id) = self.album_id {
let my_album = albums
.filter(id.eq(album_id))
.first::<Album>(conn)?;
Ok(Some(my_album))
} else {
Ok(None)
}
}
}
/// Model for a history entry
#[cfg_attr(feature = "ssr", derive(Queryable, Selectable, Insertable))]
#[cfg_attr(feature = "ssr", diesel(table_name = crate::schema::song_history))]
#[cfg_attr(feature = "ssr", diesel(check_for_backend(diesel::pg::Pg)))]
#[derive(Serialize, Deserialize)]
pub struct HistoryEntry {
/// A unique id for the history entry
#[cfg_attr(feature = "ssr", diesel(deserialize_as = i32))]
pub id: Option<i32>,
/// The id of the user who listened to the song
pub user_id: i32,
/// The date the song was listened to
pub date: NaiveDateTime,
/// The id of the song that was listened to
pub song_id: i32,
}
/// Model for a playlist
#[cfg_attr(feature = "ssr", derive(Queryable, Selectable, Insertable))]
#[cfg_attr(feature = "ssr", diesel(table_name = crate::schema::playlists))]
#[cfg_attr(feature = "ssr", diesel(check_for_backend(diesel::pg::Pg)))]
#[derive(Serialize, Deserialize)]
pub struct Playlist {
/// A unique id for the playlist
#[cfg_attr(feature = "ssr", diesel(deserialize_as = i32))]
pub id: Option<i32>,
/// The time the playlist was created
#[cfg_attr(feature = "ssr", diesel(deserialize_as = NaiveDateTime))]
pub created_at: Option<NaiveDateTime>,
/// The time the playlist was last updated
#[cfg_attr(feature = "ssr", diesel(deserialize_as = NaiveDateTime))]
pub updated_at: Option<NaiveDateTime>,
/// The id of the user who owns the playlist
pub owner_id: i32,
/// The name of the playlist
pub name: String,
}


@@ -0,0 +1,16 @@
use crate::prelude::*;
/// Model for an album
#[db_type(albums)]
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct Album {
/// A unique id for the album
#[omit_new]
pub id: i32,
/// The album's title
pub title: String,
/// The album's release date
pub release_date: Option<NaiveDate>,
/// The path to the album's image file
pub image_path: Option<LocalPath>,
}


@@ -0,0 +1,36 @@
use crate::prelude::*;
/// Model for an artist
#[db_type(artists)]
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct Artist {
/// A unique id for the artist
#[omit_new]
pub id: i32,
/// The artist's name
pub name: String,
/// The path to the artist's image file
pub image_path: Option<LocalPath>,
}
impl Artist {
/// Display a list of artists as a string.
///
/// For one artist, displays [artist1]. For two artists, displays [artist1] & [artist2].
/// For three or more artists, displays [artist1], [artist2] & [artist3].
pub fn display_list(artists: &[Artist]) -> String {
let mut artist_list = String::new();
for (i, artist) in artists.iter().enumerate() {
if i == 0 {
artist_list.push_str(&artist.name);
} else if i == artists.len() - 1 {
artist_list.push_str(&format!(" & {}", artist.name));
} else {
artist_list.push_str(&format!(", {}", artist.name));
}
}
artist_list
}
}


@@ -0,0 +1,16 @@
use crate::prelude::*;
/// Model for a history entry
#[db_type(crate::schema::song_history)]
#[derive(Serialize, Deserialize)]
pub struct HistoryEntry {
/// A unique id for the history entry
#[omit_new]
pub id: i32,
/// The id of the user who listened to the song
pub user_id: i32,
/// The date the song was listened to
pub date: NaiveDateTime,
/// The id of the song that was listened to
pub song_id: i32,
}

src/models/backend/mod.rs

@@ -0,0 +1,25 @@
// These "models" are used to represent the data in the database
// Diesel uses these models to generate the SQL queries that are used to interact with the database.
// These types are also used for API endpoints, for consistency. Because the file must be compiled
// for both the server and the client, we use the `cfg_attr` attribute to conditionally add
// diesel-specific attributes to the models when compiling for the server.
pub mod album;
pub mod artist;
pub mod history_entry;
pub mod playlist;
pub mod song;
pub mod user;
pub use album::Album;
pub use album::NewAlbum;
pub use artist::Artist;
pub use artist::NewArtist;
pub use history_entry::HistoryEntry;
pub use history_entry::NewHistoryEntry;
pub use playlist::NewPlaylist;
pub use playlist::Playlist;
pub use song::NewSong;
pub use song::Song;
pub use user::NewUser;
pub use user::User;


@@ -0,0 +1,22 @@
use crate::prelude::*;
/// Model for a playlist
#[db_type(playlists)]
#[derive(Serialize, Deserialize, Clone)]
pub struct Playlist {
/// A unique id for the playlist
#[omit_new]
pub id: i32,
/// The time the playlist was created
#[omit_new]
pub created_at: NaiveDateTime,
/// The time the playlist was last updated
#[omit_new]
pub updated_at: NaiveDateTime,
/// The id of the user who owns the playlist
pub owner_id: i32,
/// The name of the playlist
pub name: String,
/// The path to the playlist's image file
pub image_path: Option<LocalPath>,
}


@@ -0,0 +1,43 @@
use crate::prelude::*;
#[db_type(songs)]
#[derive(Clone, Serialize, Deserialize)]
pub struct Song {
/// A unique id for the song
#[omit_new]
pub id: i32,
/// The song's title
pub title: String,
/// The album the song is from
pub album_id: Option<i32>,
/// The track number of the song on the album
pub track: Option<i32>,
/// The duration of the song in seconds
pub duration: i32,
/// The song's release date
pub release_date: Option<NaiveDate>,
/// The path to the song's audio file
pub storage_path: LocalPath,
/// The path to the song's image file
pub image_path: Option<LocalPath>,
/// The date the song was added to the database
#[omit_new]
pub added_date: NaiveDateTime,
}
impl Song {
#[cfg(feature = "ssr")]
pub fn image_web_path_or_placeholder(&self, album: Option<&backend::Album>) -> WebPath {
self.image_path
.as_ref()
.and_then(|path| {
path.to_web_path(AssetType::Image)
.inspect_err(|e| warn!("{e}"))
.ok()
})
.unwrap_or_else(|| {
let album_image_path = album.and_then(|album| album.image_path.clone());
LocalPath::to_web_path_or_placeholder(album_image_path)
})
}
}
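`image_web_path_or_placeholder` resolves a song's artwork through a fallback chain: the song's own image, then the album's image, then a placeholder. A minimal sketch of that chain, using plain `&str` paths in place of `LocalPath`/`WebPath` and ignoring conversion failures (`image_or_placeholder` is a hypothetical helper):

```rust
// Fallback chain: song image, else album image, else the placeholder asset.
fn image_or_placeholder<'a>(song: Option<&'a str>, album: Option<&'a str>) -> &'a str {
    song.or(album)
        .unwrap_or("/assets/images/placeholders/MusicPlaceholder.svg")
}

fn main() {
    assert_eq!(image_or_placeholder(Some("/img/s.jpg"), Some("/img/a.jpg")), "/img/s.jpg");
    assert_eq!(image_or_placeholder(None, Some("/img/a.jpg")), "/img/a.jpg");
    assert_eq!(
        image_or_placeholder(None, None),
        "/assets/images/placeholders/MusicPlaceholder.svg"
    );
}
```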


@@ -0,0 +1,27 @@
use crate::prelude::*;
/// Model for a "User", used for querying the database
///
/// Various fields are wrapped in Options, because they are not always wanted for inserts/retrieval.
/// Using `deserialize_as` makes Diesel use the specified type when deserializing from the database,
/// and then call `.into()` to convert it into the Option
#[db_type(users)]
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct User {
/// A unique id for the user
#[omit_new]
pub id: i32,
/// The user's username
pub username: String,
/// The user's email
pub email: String,
/// The user's password, stored as a hash
#[cfg_attr(feature = "ssr", diesel(deserialize_as = String))]
pub password: Option<String>,
/// The time the user was created
#[omit_new]
pub created_at: NaiveDateTime,
/// Whether the user is an admin
pub admin: bool,
/// The path to the user's profile picture file
pub image_path: Option<LocalPath>,
}


@@ -0,0 +1,33 @@
use crate::prelude::*;
/// Holds information about an album
///
/// Intended to be used in the front-end
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct Album {
/// Album id
pub id: i32,
/// Album title
pub title: String,
/// Album artists
pub artists: Vec<backend::Artist>,
/// Album release date
pub release_date: Option<NaiveDate>,
/// Path to album image, relative to the root of the web server.
/// For example, `"/assets/images/Album.jpg"`
pub image_path: WebPath,
}
impl From<Album> for DashboardTile {
fn from(val: Album) -> Self {
DashboardTile {
image_path: val.image_path.path().into(),
title: val.title.into(),
link: format!("/album/{}", val.id).into(),
description: Some(
format!("Album • {}", backend::Artist::display_list(&val.artists)).into(),
),
}
}
}


@@ -0,0 +1,26 @@
use crate::prelude::*;
/// Holds information about an artist
///
/// Intended to be used in the front-end
#[derive(Clone, Debug, Serialize, Deserialize)]
pub struct Artist {
/// Artist id
pub id: i32,
/// Artist name
pub name: String,
/// Path to artist image, relative to the root of the web server.
/// For example, `"/assets/images/Artist.jpg"`
pub image_path: WebPath,
}
impl From<Artist> for DashboardTile {
fn from(val: Artist) -> Self {
DashboardTile {
image_path: val.image_path.path().into(),
title: val.name.into(),
link: format!("/artist/{}", val.id).into(),
description: Some("Artist".into()),
}
}
}


@@ -0,0 +1,13 @@
pub mod album;
pub mod artist;
pub mod playlist;
pub mod playstatus;
pub mod song;
pub mod user;
pub use album::Album;
pub use artist::Artist;
pub use playlist::Playlist;
pub use playstatus::PlayStatus;
pub use song::Song;
pub use user::User;


@@ -0,0 +1,32 @@
use crate::prelude::*;
/// Model for a playlist
#[derive(Serialize, Deserialize, Clone)]
pub struct Playlist {
/// A unique id for the playlist
pub id: i32,
/// The time the playlist was created
pub created_at: NaiveDateTime,
/// The time the playlist was last updated
pub updated_at: NaiveDateTime,
/// The id of the user who owns the playlist
pub owner_id: i32,
/// The name of the playlist
pub name: String,
/// The path to the playlist's image file, relative to the root of the web server
pub image_path: WebPath,
}
#[cfg(feature = "ssr")]
impl From<backend::Playlist> for Playlist {
fn from(playlist: backend::Playlist) -> Self {
Playlist {
id: playlist.id,
created_at: playlist.created_at,
updated_at: playlist.updated_at,
owner_id: playlist.owner_id,
name: playlist.name,
image_path: LocalPath::to_web_path_or_placeholder(playlist.image_path),
}
}
}

Some files were not shown because too many files have changed in this diff.