Compare commits
550 Commits
Author | SHA1 | Date |
---|---|---|
Ozzie Isaacs | ab11919c0b | |
Ozzie Isaacs | 58c269881f | |
Ozzie Isaacs | 6760d6971c | |
Kreeblah | ad05534ed2 | |
Ozzie Isaacs | ee451fb236 | |
Ozzie Isaacs | ab13fcf60c | |
Ozzie Isaacs | 6f60ec7b99 | |
Ozzie Isaacs | 894fd9d30a | |
Ozzie Isaacs | 2b1efdb50e | |
Ozzie Isaacs | fc9a9cb9ac | |
Ozzie Isaacs | e99be72ff7 | |
Ozzie Isaacs | f1ceff2b52 | |
Ozzie Isaacs | 7e85894b3a | |
Ozzie Isaacs | 5c49c8cdd7 | |
Ozzie Isaacs | c8c3b3cba3 | |
Ozzie Isaacs | 4911843146 | |
Ozzie Isaacs | 2c37546598 | |
Ozzie Isaacs | 25a875b628 | |
Ozzie Isaacs | 506f0a33cf | |
Ozzie Isaacs | 921caf6716 | |
Ozzie Isaacs | 60ed1904f5 | |
Ozzie Isaacs | cb62d36e44 | |
Ozzie Isaacs | 737d758362 | |
Ozzie Isaacs | 8e27912ff5 | |
Ozzie Isaacs | 3a603cec22 | |
eggy | b1d7badef4 | |
Ozzie Isaacs | e591211b57 | |
Ozzie Isaacs | a305c35de4 | |
growfrow | 51d306b11d | |
mapi68 | abb418fe86 | |
Ozzie Isaacs | 0925f34557 | |
Ozzie Isaacs | 15952a764c | |
Ozzie Isaacs | fcc95bd895 | |
Ghighi Eftimie | 964e7de920 | |
Ozzie Isaacs | 14b578dd3a | |
Ozzie Isaacs | becb84a73d | |
Ozzie Isaacs | c901ccbb01 | |
Ozzie Isaacs | f987fb0aba | |
Ozzie Isaacs | c30460d76b | |
Ozzie Isaacs | 97380b4b3f | |
Ozzie Isaacs | 4fbd064b85 | |
Ozzie Isaacs | abbd9a5888 | |
Ozzie Isaacs | e860b4e097 | |
Ozzie Isaacs | 23a8a4657d | |
Ozzie Isaacs | b38a1b2298 | |
Ozzie Isaacs | 0ebfba8d05 | |
Ozzie Isaacs | 990ad8d72d | |
Ozzie Isaacs | c3fc125501 | |
Ozzie Isaacs | 3c4ed0de1a | |
Ozzie Isaacs | 117c92233d | |
Ozzie Isaacs | 2ba14acf4f | |
Ozzie Isaacs | 80a2d07009 | |
Ozzie Isaacs | ff9e1ed7c8 | |
Ozzie Isaacs | 8e5bee5352 | |
Ozzie Isaacs | d659430116 | |
Ozzie Isaacs | 859dac462b | |
Ozzie Isaacs | 2bea4dbd06 | |
Ozzie Isaacs | 0180b4b6b5 | |
Ozzie Isaacs | 2bfb02c448 | |
Ozzie Isaacs | 4864254e37 | |
Ozzie Isaacs | 09dce28a0e | |
Ozzie Isaacs | e55d09d8bb | |
Ozzie Isaacs | 92c162b2fd | |
Ozzie Isaacs | 57fb5001e2 | |
Ozzie Isaacs | 64e5314148 | |
Ozzie Isaacs | 873602a5c9 | |
Ozzie Isaacs | 09e966e18a | |
Ozzie Isaacs | f7718cae0c | |
Ozzie Isaacs | 90e728516c | |
Ozzie Isaacs | 7c04b68c88 | |
Ozzie Isaacs | 8549689a0f | |
Ozzie Isaacs | d8f5c17518 | |
mapi68 | 05367d2df5 | |
Webysther Sperandio | eb6fbfc90c | |
Ozzie Isaacs | c2267b6902 | |
Ozzie Isaacs | 0e5520a261 | |
Ozzie Isaacs | 6f5e9f167e | |
Ozzie Isaacs | ce83fb6816 | |
Ozzie Isaacs | fbfb7adef6 | |
Ozzie Isaacs | cc52ad5d27 | |
Ozzie Isaacs | 706b9c4013 | |
Ozzie Isaacs | 6972c1b841 | |
Ozzie Isaacs | b9c329535d | |
Ozzie Isaacs | 8fdf7a94ab | |
Ozzie Isaacs | 31a344b410 | |
Ozzie Isaacs | 3814fbf08f | |
Ozzie Isaacs | ffc13a5565 | |
Ozzie Isaacs | 74c61d9685 | |
Ozzie Isaacs | b8031cd53f | |
Ozzie Isaacs | 898e76fc37 | |
Ozzie Isaacs | af71a1a2ed | |
Ozzie Isaacs | e0327db08f | |
Ozzie Isaacs | bf2ac97c47 | |
Ozzie Isaacs | 902fa254b0 | |
Ozzie Isaacs | f0cc93abd3 | |
Ozzie Isaacs | 977f07364b | |
Ozzie Isaacs | 00acd745f4 | |
Ozzie Isaacs | d272f43424 | |
Whatever Cloud | 7a8d8375d0 | |
Johannes H | 3aa75ef4a7 | |
Ozzie Isaacs | 25fb8d934f | |
GONCALVES Nelson (T0025615) | f08c8faaff | |
Michiel Cornelissen | bc0ebdb78d | |
Ozzie Isaacs | 2a4b3cb7af | |
Ozzie Isaacs | 4401cf66d1 | |
Ozzie Isaacs | d353c9b6d3 | |
Ozzie Isaacs | 0aba96c032 | |
Ozzie Isaacs | c60b7e9192 | |
Ozzie Isaacs | 23033255b8 | |
Ozzie Isaacs | 31c8909dea | |
Ozzie Isaacs | 9ef89dbcc3 | |
Ozzie Isaacs | 1086296d1d | |
Ozzie Isaacs | d341faf204 | |
Ozzie Isaacs | 2334e8f9c9 | |
Ozzie Isaacs | 90ad570578 | |
Ozzie Isaacs | fd90d6e375 | |
Ozzie Isaacs | 7fbbb85f47 | |
Ozzie Isaacs | 52c7557878 | |
Ozzie Isaacs | 794cd354ca | |
Ghighi Eftimie | 389e3f09f5 | |
Ghighi Eftimie | 285979b68d | |
Ozzie Isaacs | 3a012c900e | |
Ozzie Isaacs | ec45de3212 | |
Ozzie Isaacs | f644a2a136 | |
Russell | 01108aac42 | |
Russell Troxel | 400c745692 | |
ye | 9841a4d068 | |
Ozzie Isaacs | 7fd1d10fca | |
Ozzie Isaacs | 4f6bbfa8b8 | |
Ozzie Isaacs | cf6810db87 | |
Ozzie Isaacs | 5afff2231e | |
Ozzie Isaacs | d611582b78 | |
Ozzie Isaacs | 3bbd8ee27e | |
Ozzie Isaacs | f78e0ff938 | |
Ozzie Isaacs | bd71391bfb | |
Ozzie Isaacs | 20b2936cc1 | |
Ozzie Isaacs | 19825a635a | |
Ozzie Isaacs | 0d611d35de | |
Ozzie Isaacs | effd026fe2 | |
Ozzie Isaacs | d68e57c4fc | |
Ozzie Isaacs | 184ce23351 | |
Ozzie Isaacs | 2fbc3da451 | |
Ozzie Isaacs | fad6550ff1 | |
Ozzie Isaacs | b7aaa0f24d | |
Ozzie Isaacs | 5040bb762c | |
Ozzie Isaacs | 55deca1ec8 | |
Ozzie Isaacs | 40a16f4717 | |
Ozzie Isaacs | d26e60724a | |
Ozzie Isaacs | d877fa1c68 | |
Ozzie Isaacs | d55bafdfa9 | |
Ozzie Isaacs | a2a431802a | |
Ozzie Isaacs | b2e4907165 | |
Ozzie Isaacs | 6c2e40f544 | |
Ozzie Isaacs | 5e3d0ec2ad | |
Ozzie Isaacs | c550d6c90d | |
bacpd | 3b1d0b4013 | |
Ozzie Isaacs | 3d07efbb4f | |
mapi68 | c0ae5bb381 | |
Ozzie Isaacs | 6e755a26f9 | |
Ozzie Isaacs | c45188beb2 | |
Ozzie Isaacs | 0736c53d7b | |
Ozzie Isaacs | f0f8011d24 | |
Ozzie Isaacs | 65f3ecb924 | |
Ozzie Isaacs | 87b3999ec8 | |
Ozzie Isaacs | e32312b54a | |
Ozzie Isaacs | d7ea569e5d | |
Ozzie Isaacs | 96958e7266 | |
Ozzie Isaacs | 2c339ed10c | |
Ozzie Isaacs | dc2c30f508 | |
Ozzie Isaacs | 0c43d80163 | |
Ozzie Isaacs | df71a86f94 | |
Ozzie Isaacs | 7ed56b4397 | |
Ozzie Isaacs | 5ceb2b6d83 | |
Ozzie Isaacs | 8abea1ddd0 | |
Ozzie Isaacs | 11816d3405 | |
Ozzie Isaacs | 198bff928f | |
lawsssscat | cac200ba61 | |
David K | 8cc36ab081 | |
databoy2k | b3d1558df8 | |
byword77 | a045b6f467 | |
Ozzie Isaacs | 7a961c9011 | |
Ozzie Isaacs | 444ac181f8 | |
Ozzie Isaacs | 4bbcec21e4 | |
Ozzie Isaacs | 5509d4598b | |
Ozzie Isaacs | fab35e69ec | |
Ozzie Isaacs | 4f0f5b1495 | |
Ozzie Isaacs | cfa309f0d1 | |
Ozzie Isaacs | 885d914f18 | |
Ozzie Isaacs | b580f418f7 | |
Ozzie Isaacs | 6a14e2cf68 | |
Ozzie Isaacs | 8535bb5821 | |
Ozzie Isaacs | b2a26a421c | |
Ozzie Isaacs | 7aea7fc0bb | |
Ozzie Isaacs | 52172044e6 | |
Ozzie Isaacs | 9b99427c84 | |
Ozzie Isaacs | 0499e578cd | |
Ozzie Isaacs | b3a85ffcbb | |
PhracturedBlue | 074e611705 | |
Ozzie Isaacs | a1899bf582 | |
Ozzie Isaacs | f7ff3e7cba | |
Ozzie Isaacs | 3a08b91ffa | |
Ozzie Isaacs | d253804a50 | |
Ozzie Isaacs | ba0e5399d6 | |
Ozzie Isaacs | 7818c4a7b0 | |
Ozzie Isaacs | 3f6a12898b | |
Ozzie Isaacs | caf69669cb | |
Ozzie Isaacs | de59181be7 | |
Ozzie Isaacs | 966c9236b9 | |
Ozzie Isaacs | 34c6010ad0 | |
Ozzie Isaacs | 60e904967b | |
Ozzie Isaacs | 3efcbcc679 | |
Ozzie Isaacs | 7bb4bc934c | |
Ozzie Isaacs | dcb8a0f77b | |
Ozzie Isaacs | 2f12b2e315 | |
Ozzie Isaacs | 279f0569e4 | |
Ozzie Isaacs | 6723369d65 | |
Ozzie Isaacs | 4b93ac034f | |
Ozzie Isaacs | fda62dde1d | |
Ozzie Isaacs | df74fdb4d1 | |
Ozzie Isaacs | cce538d5a7 | |
Ozzie Isaacs | e8b0051b31 | |
Ozzie Isaacs | fe55958ecc | |
Ozzie Isaacs | 7b321d63c1 | |
Ozzie Isaacs | 986eaf9f02 | |
Ozzie Isaacs | caf8ed77d7 | |
Ghighi Eftimie | ee5cfa1f36 | |
Horus68 | 5eef476135 | |
Horus68 | b5e4a88357 | |
Horus68 | 256f4bb428 | |
Horus68 | a4d45512ee | |
Horus68 | 074687c330 | |
archont | 2f7b175dda | |
Ozzie Isaacs | a256bd5260 | |
Ozzie Isaacs | fdd1410b06 | |
Ozzie Isaacs | 3f5583017f | |
Ozzie Isaacs | 63b7d70f33 | |
Ozzie Isaacs | 500758050c | |
Ozzie Isaacs | 4b4c0daab0 | |
Ozzie Isaacs | 709a4e51ba | |
Ozzie Isaacs | eff0750d77 | |
Ozzie Isaacs | e63a04093c | |
Ozzie Isaacs | 07d97d18d0 | |
Ozzie Isaacs | d8f30983d5 | |
Ozzie Isaacs | 062efc4e78 | |
boosh | 4e6c9c2703 | |
Ozzie Isaacs | 4dc5885723 | |
quarz12 | 39638d3c9c | |
Ozzie Isaacs | 3ef34c8f15 | |
Ozzie Isaacs | 932abbf090 | |
Ozzie Isaacs | 860443079d | |
Ozzie Isaacs | bd4b7ffaba | |
Ozzie Isaacs | 33e35eeb52 | |
Ozzie Isaacs | ed09814460 | |
Ozzie Isaacs | e52eb74121 | |
Daniel | dc7fbce4f7 | |
Ozzie Isaacs | dad0fd5a1c | |
Ozzie Isaacs | 8a87c152b4 | |
Ozzie Isaacs | 16baa306c5 | |
Daniel | 2eb334fb3d | |
Ozzie Isaacs | cb7356a04d | |
Ozzie Isaacs | 63a561bf9b | |
Ozzie Isaacs | cc733454b2 | |
whilenot | 940544577a | |
xlivevil | 9e0fc320cb | |
xlivevil | bf3ca20fb2 | |
Ozzie Isaacs | c7e1736ade | |
Ozzie Isaacs | bc6a50550e | |
Ozzie Isaacs | fe4dc1bb8f | |
Ozzie Isaacs | f2369609e8 | |
Ozzie Isaacs | de4d6ec7df | |
mapi68 | 7754f4aa5d | |
Ozzie Isaacs | 524751ea51 | |
Ozzie Isaacs | 8111d0dd51 | |
Ozzie Isaacs | fad5929253 | |
Ozzie Isaacs | 9f28144779 | |
Ozzie Isaacs | 42fd6973a0 | |
driz | b2e20ff50c | |
Ozzie Isaacs | 6075b3dd1d | |
driz | 37871ea8cb | |
Ozzie Isaacs | e2785c3985 | |
Ozzie Isaacs | dba83a2900 | |
Ozzie Isaacs | 33c19b20f4 | |
Ozzie Isaacs | d2f39d3dce | |
Ozzie Isaacs | 1c8bc78b48 | |
Ozzie Isaacs | 6c6841f8b0 | |
Ozzie Isaacs | 592216588c | |
Wladimir Kirianov | f4db0f04d2 | |
Wladimir Kirianov | b16e3a6e2c | |
Ozzie Isaacs | 13c0d30a8f | |
Ozzie Isaacs | b9c942befc | |
Ozzie Isaacs | a68a0dd037 | |
Thomas de Ruiter | a952c36ab7 | |
Thomas de Ruiter | 5f0c7737fe | |
Ozzie Isaacs | 38484624e9 | |
Ozzie Isaacs | 1451a67912 | |
Ozzie Isaacs | a72f0a160b | |
Ozzie Isaacs | 253386b0a5 | |
Ozzie Isaacs | 6c8ffb3e7e | |
Ozzie Isaacs | 7d26e6fc85 | |
Ozzie Isaacs | 085a6b88a3 | |
Ozzie Isaacs | bde36e3cd4 | |
Ozzie Isaacs | 9646b6e2dd | |
Ozzie Isaacs | d35e781d41 | |
Ozzie Isaacs | 321db4d712 | |
Ozzie Isaacs | 2b9f920454 | |
Ozzie Isaacs | 45acd3febe | |
Ozzie Isaacs | cbd7ca2f3e | |
Ozzie Isaacs | ba7fee3918 | |
Ozzie Isaacs | 1210ccb43f | |
Jerry Vonau | 04f1f6493b | |
Ozzie Isaacs | 46d2d217ee | |
Ozzie Isaacs | e3fffa8a8f | |
Ozzie Isaacs | dfb49bfca9 | |
Ozzie Isaacs | 224777f5e3 | |
Ozzie Isaacs | 7ade4615a4 | |
Ozzie Isaacs | cbd679eb24 | |
Ozzie Isaacs | b277ed3359 | |
Ozzie Isaacs | fa95b07a95 | |
Ozzie Isaacs | 87bc8c6d96 | |
Ozzie Isaacs | db2bc6a2c2 | |
Ozzie Isaacs | cf850c6ed5 | |
Ozzie Isaacs | a6b54e398b | |
Ozzie Isaacs | 28eeb9eec3 | |
Ozzie Isaacs | 7d76f2ae33 | |
Ozzie Isaacs | 49e4f540c9 | |
Ozzie Isaacs | 64e9b13311 | |
Ozzie Isaacs | 3cf778b591 | |
Ozzie Isaacs | 942bcff5c4 | |
Ozzie Isaacs | 5c5db34a52 | |
Ozzie Isaacs | ae850172a3 | |
Ozzie Isaacs | 7ff4747f63 | |
Ozzie Isaacs | 76b0411c33 | |
Ozzie Isaacs | a414db0243 | |
Ozzie Isaacs | 162ac73bee | |
Ozzie Isaacs | fc31132f4e | |
Ozzie Isaacs | 856dce8add | |
Ozzie Isaacs | 3debe4aa4b | |
Ozzie Isaacs | b28a2cc58c | |
Ozzie Isaacs | 5fd0e4c046 | |
Ozzie Isaacs | 595f01e7a3 | |
Ozzie Isaacs | c79aa75f00 | |
Ozzie Isaacs | 0177a8bcca | |
Ozzie Isaacs | 38c601bb10 | |
Ozzie Isaacs | e7a6fe0bec | |
Ozzie Isaacs | 6119eb3681 | |
Ozzie Isaacs | 6b2ca9537d | |
Ozzie Isaacs | 3cb9a9b04a | |
Ozzie Isaacs | 3d8256b6a6 | |
Ozzie Isaacs | 660d1fb1ff | |
Ozzie Isaacs | fa3fe47059 | |
Ozzie Isaacs | 89bc72958e | |
Ozzie Isaacs | 73ea18b8ce | |
Ozzie Isaacs | 7ca07f06ce | |
Ozzie Isaacs | 8ee34bf428 | |
Ozzie Isaacs | 66d5b5a697 | |
Ozzie Isaacs | ce48e06c45 | |
Ozzie Isaacs | f4ecfe4aca | |
Ozzie Isaacs | dda20eb912 | |
GarcaMan | c4326c9495 | |
Ozzie Isaacs | 63a3edd429 | |
Ozzie Isaacs | 3b45234beb | |
Ozzie Isaacs | 8d0a699078 | |
Ozzie Isaacs | 5b5146a793 | |
Ozzie Isaacs | 7a4e6fbdfb | |
Ozzie Isaacs | 14d14637cd | |
Ozzie Isaacs | fb42f6bfff | |
Ozzie Isaacs | 4b7a0f3662 | |
Ozzie Isaacs | 275675b48a | |
Ozzie Isaacs | 907606295d | |
Ozzie Isaacs | 794c6ba254 | |
Ozzie Isaacs | ac13f6042a | |
Ozzie Isaacs | f8fbc807f1 | |
Ozzie Isaacs | 98da7dd5b0 | |
Ozzie Isaacs | 1c3b69c710 | |
mapi68 | 1dd638a786 | |
_Fervor_ | 6da7d05c6c | |
_Fervor_ | 3f72c3fffe | |
Ozzie Isaacs | cf9a7d538f | |
Ozzie Isaacs | ea9e8d4384 | |
Ozzie Isaacs | b9769a0975 | |
Ozzie Isaacs | 3a262661b5 | |
Ozzie Isaacs | d2056ceb51 | |
Ozzie Isaacs | e71a3452e1 | |
Ozzie Isaacs | 189da65fac | |
Ozzie Isaacs | 0e6b7f96d3 | |
Ozzie Isaacs | 1babb566fb | |
Ozzie Isaacs | c4e4acfc26 | |
Ozzie Isaacs | 6afb429185 | |
Ozzie Isaacs | f241b260d7 | |
Ozzie Isaacs | 260a694834 | |
Ozzie Isaacs | 508e2b4d0a | |
Ozzie Isaacs | 9701a97a57 | |
Ozzie Isaacs | 4913f06e0d | |
Petipopotam | d545ea9e6f | |
Petipopotam | 1ad8dc102a | |
Ozzie Isaacs | 36cb454d1c | |
Ozzie Isaacs | 1899cda8d1 | |
Ozzie Isaacs | 8dd4d0be1b | |
Ozzie Isaacs | d48d6880af | |
Ozzie Isaacs | 94a6931d48 | |
Ozzie Isaacs | c21a870b8e | |
Ozzie Isaacs | 791bc9621a | |
Ozzie Isaacs | 2d6fe483ba | |
Ozzie Isaacs | ad43f07dab | |
Ozzie Isaacs | 77637d81dd | |
Ozzie Isaacs | a2bf6dfb7b | |
Ozzie Isaacs | 1cd05d614c | |
Ozzie Isaacs | d75f681247 | |
Ozzie Isaacs | 2be2920833 | |
Ozzie Isaacs | d6184619f5 | |
Ozzie Isaacs | 43ee85fbb5 | |
Ozzie Isaacs | 8022b1bb36 | |
Ozzie Isaacs | 9e75c65af8 | |
Ozzie Isaacs | 7881950e66 | |
Ozzie Isaacs | 031658ae94 | |
Arief Hidayat | 48c2c7b543 | |
Petipopotam | beb619c2c2 | |
Petipopotam | ed22209e6c | |
blitzmann | 364c48edd8 | |
Ozzie Isaacs | e178efb58c | |
Vegard Fladby | 4105c64320 | |
Josh O'Brien | b3335f6733 | |
Benedikt McMullin | fba95956de | |
Ozzie Isaacs | ce0b3d8d10 | |
Ozzie Isaacs | 9545aa2a0b | |
jvoisin | 4629eec774 | |
Jeroen Kroese | 4977381b1c | |
Ozzie Isaacs | 6c1631acba | |
Ozzie Isaacs | 1489228649 | |
Ozzie Isaacs | 74efa52f26 | |
Ozzie Isaacs | 1ca1281346 | |
jvoisin | 631496775e | |
jvoisin | c5e539bbcd | |
jvoisin | 02ec853e3b | |
Ozzie Isaacs | d0411fd9c7 | |
Ozzie Isaacs | 567cb2e097 | |
Ozzie Isaacs | a635e136be | |
Ozzie Isaacs | 5ffb3e917f | |
Ozzie Isaacs | 5dc3385ae5 | |
Ozzie Isaacs | 66e0a81d23 | |
Ozzie Isaacs | 1f6eb2def6 | |
Ozzie Isaacs | 7d3af5bbd0 | |
Ozzie Isaacs | 043a612d1a | |
Ozzie Isaacs | 928e24fd1a | |
Ozzie Isaacs | 3361c41c6d | |
Ozzie Isaacs | 85a6616606 | |
Ozzie Isaacs | c15b603fef | |
Ozzie Isaacs | b12e47d0e5 | |
Ozzie Isaacs | 389263f5e7 | |
Ozzie Isaacs | 307b4526f6 | |
jvoisin | 7d023ce741 | |
Julien Voisin | 2ddbaa2150 | |
jvoisin | 29fef4a314 | |
JonathanHerrewijnen | 9450084d6e | |
Vijay Pillai | b52c7aac53 | |
Feige-cn | e8c461b14f | |
Ghighi Eftimie | 9409b9db9c | |
Ghighi Eftimie | a992aafc13 | |
Ghighi Eftimie | b663f1ce83 | |
Olivier | b45d69ef2d | |
Olivier | a80735d7d3 | |
Olivier | adfbd447ed | |
xlivevil | 73567db4fb | |
Ozzieisaacs | 3d59a78c9f | |
Ozzieisaacs | 8ba23ac3ee | |
Ghighi Eftimie | 397cd987cb | |
Ozzie Isaacs | 7eef44f73c | |
Ozzie Isaacs | e22ecda137 | |
ElQuimm | a003cd9758 | |
Ozzie Isaacs | 44f6655dd2 | |
Ozzie Isaacs | bd52f08a30 | |
Ozzie Isaacs | edc9703716 | |
Ozzie Isaacs | 56d697122c | |
Ozzie Isaacs | d39a43e838 | |
ElQuimm | 9df3a2558d | |
xlivevil | 7339c804a3 | |
xlivevil | 4d61c5535e | |
xlivevil | 09e1ec3d08 | |
Ozzie Isaacs | 8421a017f4 | |
Ozzie Isaacs | 27eb514ca4 | |
Ozzie Isaacs | b4d9e400d9 | |
Ozzie Isaacs | 67bc23ee0c | |
Ozzie Isaacs | b898b37e29 | |
Ozzie Isaacs | 10dcf39d50 | |
Ozzie Isaacs | e676e1685b | |
Ozzie Isaacs | 59a5ccd05c | |
Ozzie Isaacs | 04908e22fe | |
Ozzie Isaacs | 0f67e57be4 | |
Ozzie Isaacs | 071d19b8b3 | |
halink0803 | 1ffa190938 | |
Ozzie Isaacs | c10708ed07 | |
Ozzie Isaacs | b4851e1d70 | |
Ozzie Isaacs | 26be5ee237 | |
Ozzieisaacs | 241aa77d41 | |
Ozzieisaacs | ca0ee5d391 | |
Ozzieisaacs | 110d283a50 | |
Giulio De Pasquale | f6a9030c33 | |
Giulio De Pasquale | 452093db47 | |
Ozzieisaacs | 9fa56a2323 | |
Ozzieisaacs | 3a133901e4 | |
Ozzieisaacs | 7750ebde0f | |
Ozzieisaacs | 2472e03a69 | |
Ozzieisaacs | 6598c4d259 | |
Ozzie Isaacs | a9b20ca136 | |
Ozzie Isaacs | bf0375d51d | |
Ozzie Isaacs | 89d226e36b | |
Ozzie Isaacs | ec8844c7d4 | |
Ozzie Isaacs | e5c8a7ce50 | |
Ozzie Isaacs | dc3cafd23d | |
Ozzie Isaacs | 9de474e665 | |
Martin Brodbeck | cd143b7ef4 | |
Martin Brodbeck | 8a5112502d | |
viljasenville | 46e5305f23 | |
Thore Schillmann | 9bcbe523d7 | |
Ozzie Isaacs | ae3e3559b8 | |
Thore Schillmann | e176d63ca6 | |
Thore Schillmann | 80b0e88650 | |
Thore Schillmann | 0b4731913e | |
Thore Schillmann | fc7ce8da2d | |
Thore Schillmann | c89bc12c9b | |
Thore Schillmann | 4913673e8f | |
Thore Schillmann | fc004f4f0c | |
Ozzie Isaacs | 7344ef353c | |
Ozzie Isaacs | 3bde8a5d95 | |
Thore Schillmann | c5c3874243 | |
Thore Schillmann | 0d34f41a48 | |
Thore Schillmann | a77aef83c6 | |
Thore Schillmann | e39c6130c3 | |
Thore Schillmann | 03359599ed | |
Thore Schillmann | 3c4330ba51 | |
Thore Schillmann | 8c781ad4a4 | |
Thore Schillmann | 5e9ec706c5 | |
Ozzie Isaacs | b1c70d5b4a | |
Ozzieisaacs | c5fc30a1be | |
Ozzie Isaacs | 29fd4ae4a2 | |
Ozzieisaacs | 4ef8c35fb7 | |
Ozzieisaacs | 04326af2da | |
Ozzieisaacs | d6a31e5db8 | |
Ozzie Isaacs | 73d48e4ac1 | |
Ozzie Isaacs | b206b7a5d8 | |
Thore Schillmann | 2816a75c3e | |
GarcaMan | bf12542df5 | |
Thore Schillmann | 0f3f918153 | |
Evan Peterson | 7ae9f89bbf | |
Evan Peterson | 4eaa9413f9 | |
GarcaMan | e2eab808c0 | |
GarcaMan | 3ac08a8c0d | |
Denis Rodríguez | 3f56f0dca7 | |
GarcaMan | 7fc04b353b | |
GarcaMan | a8689ae26b |
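
The table above is a scraped GitHub compare view: one row per commit with author and short SHA-1 (the Date column was not captured). A similar listing can be produced locally with plain git. The snippet below builds a tiny throwaway repository only so it runs anywhere; the repo, refs, and author name are illustrative — in practice you would run the last two commands in a real checkout with any two refs (e.g. two calibre-web tags) in place of `base..HEAD`.

```shell
set -e
cd "$(mktemp -d)"
git init -q demo && cd demo
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m "first"
git tag base
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m "second"
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m "third"

# "Author | short SHA" rows, newest first, like the compare table above:
git log --format='%an | %h' base..HEAD

# Number of commits between the two refs (the "550 Commits" figure):
git rev-list --count base..HEAD
```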

`.gitattributes`:

```diff
@@ -1,4 +1,5 @@
 constants.py ident export-subst
 /test export-ignore
+/library export-ignore
 cps/static/css/libs/* linguist-vendored
 cps/static/js/libs/* linguist-vendored
```
Bug report template (`bug_report.md`):

```diff
@@ -6,12 +6,23 @@ labels: ''
 assignees: ''
 
 ---
-<!-- Please have a look at our [Contributing Guidelines](https://github.com/janeczku/calibre-web/blob/master/CONTRIBUTING.md) -->
 
-**Describe the bug/problem**
+## Short Notice from the maintainer
+
+After 6 years of more or less intensive programming on Calibre-Web, I need a break.
+The last few months, maintaining Calibre-Web has felt more like work than a hobby. I felt pressured and teased by people to solve "their" problems and merge PRs for "their" Calibre-Web.
+I have turned off all notifications from Github/Discord and will now concentrate undisturbed on the development of “my” Calibre-Web over the next few weeks/months.
+I will look into the issues and maybe also the PRs from time to time, but don't expect a quick response from me.
+
+Please also have a look at our [Contributing Guidelines](https://github.com/janeczku/calibre-web/blob/master/CONTRIBUTING.md)
+
+**Describe the bug/problem**
 
 A clear and concise description of what the bug is. If you are asking for support, please check our [Wiki](https://github.com/janeczku/calibre-web/wiki) if your question is already answered there.
 
 **To Reproduce**
 
 Steps to reproduce the behavior:
 1. Go to '...'
 2. Click on '....'
@@ -19,15 +30,19 @@ Steps to reproduce the behavior:
 4. See error
 
 **Logfile**
 
 Add content of calibre-web.log file or the relevant error, try to reproduce your problem with "debug" log-level to get more output.
 
 **Expected behavior**
 
 A clear and concise description of what you expected to happen.
 
 **Screenshots**
 
 If applicable, add screenshots to help explain your problem.
 
 **Environment (please complete the following information):**
 
 - OS: [e.g. Windows 10/Raspberry Pi OS]
 - Python version: [e.g. python2.7]
 - Calibre-Web version: [e.g. 0.6.8 or 087c4c59 (git rev-parse --short HEAD)]:
@@ -37,3 +52,4 @@ If applicable, add screenshots to help explain your problem.
 
 **Additional context**
 Add any other context about the problem here. [e.g. access via reverse proxy, database background sync, special database location]
 
```
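
The bug report template above asks for the Calibre-Web version either as a release number or as the short commit hash from `git rev-parse --short HEAD`. The snippet below creates a throwaway repository only so the command is runnable anywhere; when actually filing a bug, run just the final command inside your calibre-web checkout.

```shell
set -e
cd "$(mktemp -d)"
git init -q
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m "init"

# Short revision hash, as requested in the "Calibre-Web version" field:
git rev-parse --short HEAD
```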
Feature request template:

```diff
@@ -7,7 +7,14 @@ assignees: ''
 
 ---
 
-<!-- Please have a look at our [Contributing Guidelines](https://github.com/janeczku/calibre-web/blob/master/CONTRIBUTING.md) -->
+# Short Notice from the maintainer
+
+After 6 years of more or less intensive programming on Calibre-Web, I need a break.
+The last few months, maintaining Calibre-Web has felt more like work than a hobby. I felt pressured and teased by people to solve "their" problems and merge PRs for "their" Calibre-Web.
+I have turned off all notifications from Github/Discord and will now concentrate undisturbed on the development of “my” Calibre-Web over the next few weeks/months.
+I will look into the issues and maybe also the PRs from time to time, but don't expect a quick response from me.
+
+Please have a look at our [Contributing Guidelines](https://github.com/janeczku/calibre-web/blob/master/CONTRIBUTING.md)
 
 **Is your feature request related to a problem? Please describe.**
 A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
```
`.gitignore`:

```diff
@@ -28,8 +28,10 @@ cps/cache
 .idea/
 *.bak
 *.log.*
+.key
 
 settings.yaml
 gdrive_credentials
 client_secrets.json
 gmail.json
+/.key
```
`CONTRIBUTING.md`:

```diff
@@ -26,9 +26,9 @@ The Calibre-Web documentation is hosted in the Github [Wiki](https://github.com/
 
 Do not open up a GitHub issue if the bug is a **security vulnerability** in Calibre-Web. Instead, please write an email to "ozzie.fernandez.isaacs@googlemail.com".
 
-Ensure the ***bug was not already reported** by searching on GitHub under [Issues](https://github.com/janeczku/calibre-web/issues). Please also check if a solution for your problem can be found in the [wiki](https://github.com/janeczku/calibre-web/wiki).
+Ensure the **bug was not already reported** by searching on GitHub under [Issues](https://github.com/janeczku/calibre-web/issues). Please also check if a solution for your problem can be found in the [wiki](https://github.com/janeczku/calibre-web/wiki).
 
-If you're unable to find an **open issue** addressing the problem, open a [new one](https://github.com/janeczku/calibre-web/issues/new?assignees=&labels=&template=bug_report.md&title=). Be sure to include a **title** and **clear description**, as much relevant information as possible, the **issue form** helps you providing the right information. Deleting the form and just pasting the stack trace doesn't speed up fixing the problem. If your issue could be resolved, consider closing the issue.
+If you're unable to find an **open issue** addressing the problem, open a [new one](https://github.com/janeczku/calibre-web/issues/new/choose). Be sure to include a **title** and **clear description**, as much relevant information as possible, the **issue form** helps you providing the right information. Deleting the form and just pasting the stack trace doesn't speed up fixing the problem. If your issue could be resolved, consider closing the issue.
 
 ### **Feature Request**
 
```
`README.md` (152 changes; the scrape ends partway through this diff):

```diff
@@ -1,99 +1,125 @@
-# About
+# Short Notice from the maintainer
 
-Calibre-Web is a web app providing a clean interface for browsing, reading and downloading eBooks using a valid [Calibre](https://calibre-ebook.com) database.
+After 6 years of more or less intensive programming on Calibre-Web, I need a break.
+The last few months, maintaining Calibre-Web has felt more like work than a hobby. I felt pressured and teased by people to solve "their" problems and merge PRs for "their" Calibre-Web.
+I have turned off all notifications from Github/Discord and will now concentrate undisturbed on the development of “my” Calibre-Web over the next few weeks/months.
+I will look into the issues and maybe also the PRs from time to time, but don't expect a quick response from me.
+
+# Calibre-Web
 
-[![GitHub License](https://img.shields.io/github/license/janeczku/calibre-web?style=flat-square)](https://github.com/janeczku/calibre-web/blob/master/LICENSE)
-[![GitHub commit activity](https://img.shields.io/github/commit-activity/w/janeczku/calibre-web?logo=github&style=flat-square&label=commits)]()
-[![GitHub all releases](https://img.shields.io/github/downloads/janeczku/calibre-web/total?logo=github&style=flat-square)](https://github.com/janeczku/calibre-web/releases)
+Calibre-Web is a web app that offers a clean and intuitive interface for browsing, reading, and downloading eBooks using a valid [Calibre](https://calibre-ebook.com) database.
+
+[![License](https://img.shields.io/github/license/janeczku/calibre-web?style=flat-square)](https://github.com/janeczku/calibre-web/blob/master/LICENSE)
+![Commit Activity](https://img.shields.io/github/commit-activity/w/janeczku/calibre-web?logo=github&style=flat-square&label=commits)
+[![All Releases](https://img.shields.io/github/downloads/janeczku/calibre-web/total?logo=github&style=flat-square)](https://github.com/janeczku/calibre-web/releases)
 [![PyPI](https://img.shields.io/pypi/v/calibreweb?logo=pypi&logoColor=fff&style=flat-square)](https://pypi.org/project/calibreweb/)
 [![PyPI - Downloads](https://img.shields.io/pypi/dm/calibreweb?logo=pypi&logoColor=fff&style=flat-square)](https://pypi.org/project/calibreweb/)
 [![Discord](https://img.shields.io/discord/838810113564344381?label=Discord&logo=discord&style=flat-square)](https://discord.gg/h2VsJ2NEfB)
 
+<details>
+<summary><strong>Table of Contents</strong> (click to expand)</summary>
+
+1. [About](#calibre-web)
+2. [Features](#features)
+3. [Installation](#installation)
+   - [Installation via pip (recommended)](#installation-via-pip-recommended)
+   - [Quick start](#quick-start)
+   - [Requirements](#requirements)
+4. [Docker Images](#docker-images)
+5. [Contributor Recognition](#contributor-recognition)
+6. [Contact](#contact)
+7. [Contributing to Calibre-Web](#contributing-to-calibre-web)
+
+</details>
+
 *This software is a fork of [library](https://github.com/mutschler/calibreserver) and licensed under the GPL v3 License.*
 
 ![Main screen](https://github.com/janeczku/calibre-web/wiki/images/main_screen.png)
 
 ## Features
 
-- Bootstrap 3 HTML5 interface
-- full graphical setup
-- User management with fine-grained per-user permissions
+- Modern and responsive Bootstrap 3 HTML5 interface
+- Full graphical setup
+- Comprehensive user management with fine-grained per-user permissions
 - Admin interface
-- User Interface in brazilian, czech, dutch, english, finnish, french, german, greek, hungarian, italian, japanese, khmer, korean, polish, russian, simplified and traditional chinese, spanish, swedish, turkish, ukrainian
+- Multilingual user interface supporting 20+ languages ([supported languages](https://github.com/janeczku/calibre-web/wiki/Translation-Status))
 - OPDS feed for eBook reader apps
-- Filter and search by titles, authors, tags, series, book format and language
-- Create a custom book collection (shelves)
-- Support for editing eBook metadata and deleting eBooks from Calibre library
-- Support for downloading eBook metadata from various sources, sources can be extended via external plugins
-- Support for converting eBooks through Calibre binaries
-- Restrict eBook download to logged-in users
-- Support for public user registration
-- Send eBooks to E-Readers with the click of a button
-- Sync your Kobo devices through Calibre-Web with your Calibre library
-- Support for reading eBooks directly in the browser (.txt, .epub, .pdf, .cbr, .cbt, .cbz, .djvu)
-- Upload new books in many formats, including audio formats (.mp3, .m4a, .m4b)
-- Support for Calibre Custom Columns
-- Ability to hide content based on categories and Custom Column content per user
+- Advanced search and filtering options
+- Custom book collection (shelves) creation
+- eBook metadata editing and deletion support
+- Metadata download from various sources (extensible via plugins)
+- eBook conversion through Calibre binaries
+- eBook download restriction to logged-in users
+- Public user registration support
+- Send eBooks to E-Readers with a single click
+- Sync Kobo devices with your Calibre library
+- In-browser eBook reading support for multiple formats
+- Upload new books in various formats, including audio formats
+- Calibre Custom Columns support
+- Content hiding based on categories and Custom Column content per user
 - Self-update capability
-- "Magic Link" login to make it easy to log on eReaders
-- Login via LDAP, google/github oauth and via proxy authentication
+- "Magic Link" login for easy access on eReaders
+- LDAP, Google/GitHub OAuth, and proxy authentication support
 
 ## Installation
 
 #### Installation via pip (recommended)
-1. To avoid problems with already installed python dependencies, it's recommended to create a virtual environment for Calibre-Web
-2. Install Calibre-Web via pip with the command `pip install calibreweb` (Depending on your OS and or distro the command could also be `pip3`).
-3. Optional features can also be installed via pip, please refer to [this page](https://github.com/janeczku/calibre-web/wiki/Dependencies-in-Calibre-Web-Linux-and-Windows) for details
-4. Calibre-Web can be started afterwards by typing `cps`
+1. Create a virtual environment for Calibre-Web to avoid conflicts with existing Python dependencies
+2. Install Calibre-Web via pip: `pip install calibreweb` (or `pip3` depending on your OS/distro)
+3. Install optional features via pip as needed, see [this page](https://github.com/janeczku/calibre-web/wiki/Dependencies-in-Calibre-Web-Linux-and-Windows) for details
+4. Start Calibre-Web by typing `cps`
 
-In the Wiki there are also examples for: a [manual installation](https://github.com/janeczku/calibre-web/wiki/Manual-installation), [installation on Linux Mint](https://github.com/janeczku/calibre-web/wiki/How-To:Install-Calibre-Web-in-Linux-Mint-19-or-20), [installation on a Cloud Provider](https://github.com/janeczku/calibre-web/wiki/How-To:-Install-Calibre-Web-on-a-Cloud-Provider).
+*Note: Raspberry Pi OS users may encounter issues during installation. If so, please update pip (`./venv/bin/python3 -m pip install --upgrade pip`) and/or install cargo (`sudo apt install cargo`) before retrying the installation.*
+
+Refer to the Wiki for additional installation examples: [manual installation](https://github.com/janeczku/calibre-web/wiki/Manual-installation), [Linux Mint](https://github.com/janeczku/calibre-web/wiki/How-To:-Install-Calibre-Web-in-Linux-Mint-19-or-20), [Cloud Provider](https://github.com/janeczku/calibre-web/wiki/How-To:-Install-Calibre-Web-on-a-Cloud-Provider).
 
-## Quick start
+## Quick Start
 
-Point your browser to `http://localhost:8083` or `http://localhost:8083/opds` for the OPDS catalog \
-Login with default admin login \
-Set `Location of Calibre database` to the path of the folder where your Calibre library (metadata.db) lives, push "submit" button \
-Optionally a Google Drive can be used to host the calibre library [-> Using Google Drive integration](https://github.com/janeczku/calibre-web/wiki/G-Drive-Setup#using-google-drive-integration) \
-Afterwards you can configure your Calibre-Web instance ([Basic Configuration](https://github.com/janeczku/calibre-web/wiki/Configuration#basic-configuration) and [UI Configuration](https://github.com/janeczku/calibre-web/wiki/Configuration#ui-configuration) on admin page)
 
 #### Default admin login:
```
|
1. Open your browser and navigate to `http://localhost:8083` or `http://localhost:8083/opds` for the OPDS catalog
|
||||||
*Username:* admin\
|
2. Log in with the default admin credentials
|
||||||
*Password:* admin123
|
3. If you don't have a Calibre database, you can use [this database](https://github.com/janeczku/calibre-web/raw/master/library/metadata.db) (move it out of the Calibre-Web folder to prevent overwriting during updates)
|
||||||
|
4. Set `Location of Calibre database` to the path of the folder containing your Calibre library (metadata.db) and click "Save"
|
||||||
|
5. Optionally, use Google Drive to host your Calibre library by following the [Google Drive integration guide](https://github.com/janeczku/calibre-web/wiki/G-Drive-Setup#using-google-drive-integration)
|
||||||
|
6. Configure your Calibre-Web instance via the admin page, referring to the [Basic Configuration](https://github.com/janeczku/calibre-web/wiki/Configuration#basic-configuration) and [UI Configuration](https://github.com/janeczku/calibre-web/wiki/Configuration#ui-configuration) guides
|
||||||
|
|
||||||
|
#### Default Admin Login:
|
||||||
|
- **Username:** admin
|
||||||
|
- **Password:** admin123
|
||||||
|
|
||||||
## Requirements
|
## Requirements
|
||||||
|
|
||||||
python 3.5+
|
- Python 3.5+
|
||||||
|
- [Imagemagick](https://imagemagick.org/script/download.php) for cover extraction from EPUBs (Windows users may need to install [Ghostscript](https://ghostscript.com/releases/gsdnld.html) for PDF cover extraction)
|
||||||
Optionally, to enable on-the-fly conversion from one ebook format to another when using the send-to-ereader feature, or during editing of ebooks metadata:
|
- Optional: [Calibre desktop program](https://calibre-ebook.com/download) for on-the-fly conversion and metadata editing (set "calibre's converter tool" path on the setup page)
|
||||||
|
- Optional: [Kepubify tool](https://github.com/pgaskin/kepubify/releases/latest) for Kobo device support (place the binary in `/opt/kepubify` on Linux or `C:\Program Files\kepubify` on Windows)
|
||||||
[Download and install](https://calibre-ebook.com/download) the Calibre desktop program for your platform and enter the folder including program name (normally /opt/calibre/ebook-convert, or C:\Program Files\calibre\ebook-convert.exe) in the field "calibre's converter tool" on the setup page.
|
|
||||||
|
|
||||||
[Download](https://github.com/pgaskin/kepubify/releases/latest) Kepubify tool for your platform and place the binary starting with `kepubify` in Linux: `/opt/kepubify` Windows: `C:\Program Files\kepubify`.
|
|
||||||
|
|
||||||
## Docker Images
|
## Docker Images
|
||||||
|
|
||||||
A pre-built Docker image is available in these Docker Hub repository (maintained by the LinuxServer team):
|
Pre-built Docker images are available in the following Docker Hub repositories (maintained by the LinuxServer team):
|
||||||
|
|
||||||
#### **LinuxServer - x64, armhf, aarch64**
|
#### **LinuxServer - x64, aarch64**
|
||||||
+ Docker Hub - [https://hub.docker.com/r/linuxserver/calibre-web](https://hub.docker.com/r/linuxserver/calibre-web)
|
- [Docker Hub](https://hub.docker.com/r/linuxserver/calibre-web)
|
||||||
+ Github - [https://github.com/linuxserver/docker-calibre-web](https://github.com/linuxserver/docker-calibre-web)
|
- [GitHub](https://github.com/linuxserver/docker-calibre-web)
|
||||||
+ Github - (Optional Calibre layer) - [https://github.com/linuxserver/docker-calibre-web/tree/calibre](https://github.com/linuxserver/docker-calibre-web/tree/calibre)
|
- [GitHub - Optional Calibre layer](https://github.com/linuxserver/docker-mods/tree/universal-calibre)
|
||||||
|
|
||||||
This image has the option to pull in an extra docker manifest layer to include the Calibre `ebook-convert` binary. Just include the environmental variable `DOCKER_MODS=linuxserver/calibre-web:calibre` in your docker run/docker compose file. **(x64 only)**
|
Include the environment variable `DOCKER_MODS=linuxserver/mods:universal-calibre` in your Docker run/compose file to add the Calibre `ebook-convert` binary (x64 only). Omit this variable for a lightweight image.
|
||||||
|
|
||||||
If you do not need this functionality then this can be omitted, keeping the image as lightweight as possible.
|
|
||||||
|
|
||||||
Both the Calibre-Web and Calibre-Mod images are rebuilt automatically on new releases of Calibre-Web and Calibre respectively, and on updates to any included base image packages on a weekly basis if required.
|
|
||||||
+ The "path to convertertool" should be set to `/usr/bin/ebook-convert`
|
|
||||||
+ The "path to unrar" should be set to `/usr/bin/unrar`
|
|
||||||
|
|
||||||
# Contact
|
Both the Calibre-Web and Calibre-Mod images are automatically rebuilt on new releases and updates.
|
||||||
|
|
||||||
Just reach us out on [Discord](https://discord.gg/h2VsJ2NEfB)
|
- Set "path to convertertool" to `/usr/bin/ebook-convert`
|
||||||
|
- Set "path to unrar" to `/usr/bin/unrar`
|
||||||
|
|
||||||
For further information, How To's and FAQ please check the [Wiki](https://github.com/janeczku/calibre-web/wiki)
|
## Contributor Recognition
|
||||||
|
|
||||||
# Contributing to Calibre-Web
|
We would like to thank all the [contributors](https://github.com/janeczku/calibre-web/graphs/contributors) and maintainers of Calibre-Web for their valuable input and dedication to the project. Your contributions are greatly appreciated.
|
||||||
|
|
||||||
Please have a look at our [Contributing Guidelines](https://github.com/janeczku/calibre-web/blob/master/CONTRIBUTING.md)
|
## Contact
|
||||||
|
|
||||||
|
Join us on [Discord](https://discord.gg/h2VsJ2NEfB)
|
||||||
|
|
||||||
|
For more information, How To's, and FAQs, please visit the [Wiki](https://github.com/janeczku/calibre-web/wiki)
|
||||||
|
|
||||||
|
## Contributing to Calibre-Web
|
||||||
|
|
||||||
|
Check out our [Contributing Guidelines](https://github.com/janeczku/calibre-web/blob/master/CONTRIBUTING.md)
|
||||||
|
|
|
@@ -38,6 +38,13 @@ To receive fixes for security vulnerabilities it is required to always upgrade t
|
||||||
| V 0.6.18 | Possible SQL injection in the user table is prevented. Thanks to Iman Sharafaldin (Forward Security) |CVE-2022-30765|
|
| V 0.6.18 | Possible SQL injection in the user table is prevented. Thanks to Iman Sharafaldin (Forward Security) |CVE-2022-30765|
|
||||||
| V 0.6.18 | The SSRF protection can no longer be bypassed via IPv6/IPv4 embedding. Thanks to @416e6e61 |CVE-2022-0939|
|
| V 0.6.18 | The SSRF protection can no longer be bypassed via IPv6/IPv4 embedding. Thanks to @416e6e61 |CVE-2022-0939|
|
||||||
| V 0.6.18 | The SSRF protection can no longer be bypassed to connect to other servers in the local network. Thanks to @michaellrowley |CVE-2022-0990|
|
| V 0.6.18 | The SSRF protection can no longer be bypassed to connect to other servers in the local network. Thanks to @michaellrowley |CVE-2022-0990|
|
||||||
|
| V 0.6.20 | Credentials for emails are now stored encrypted ||
|
||||||
|
| V 0.6.20 | Login is rate limited ||
|
||||||
|
| V 0.6.20 | Password strength requirements can be enforced ||
|
||||||
|
| V 0.6.21 | SMTP server credentials are no longer returned to the client ||
|
||||||
|
| V 0.6.21 | Stored cross-site scripting (XSS) in href attributes can no longer bypass the filter via a data: wrapper ||
|
||||||
|
| V 0.6.21 | Cross-site scripting (XSS) is no longer possible via pathchooser ||
|
||||||
|
| V 0.6.21 | Error handling for non-existent ratings, languages, and user-downloaded books was fixed ||
|
||||||
|
|
||||||
|
|
||||||
## Statement regarding Log4j (CVE-2021-44228 and related)
|
## Statement regarding Log4j (CVE-2021-44228 and related)
|
||||||
|
|
|
@@ -1,3 +1,4 @@
|
||||||
[python: **.py]
|
[python: **.py]
|
||||||
|
|
||||||
|
# has to be executed with jinja2 >=2.9 to have autoescape enabled automatically
|
||||||
[jinja2: **/templates/**.*ml]
|
[jinja2: **/templates/**.*ml]
|
||||||
extensions=jinja2.ext.autoescape,jinja2.ext.with_
|
|
cps.py (2 changed lines)
|
@@ -21,7 +21,7 @@ import os
|
||||||
import sys
|
import sys
|
||||||
|
|
||||||
|
|
||||||
# Add local path to sys.path so we can import cps
|
# Add local path to sys.path, so we can import cps
|
||||||
path = os.path.dirname(os.path.abspath(__file__))
|
path = os.path.dirname(os.path.abspath(__file__))
|
||||||
sys.path.insert(0, path)
|
sys.path.insert(0, path)
|
||||||
|
|
||||||
|
|
|
@@ -21,15 +21,32 @@
|
||||||
# along with this program. If not, see <http://www.gnu.org/licenses/>.
|
# along with this program. If not, see <http://www.gnu.org/licenses/>.
|
||||||
|
|
||||||
|
|
||||||
from flask_login import LoginManager
|
from flask_login import LoginManager, confirm_login
|
||||||
from flask import session
|
from flask import session, current_app
|
||||||
|
from flask_login.utils import decode_cookie
|
||||||
|
from flask_login.signals import user_loaded_from_cookie
|
||||||
|
|
||||||
class MyLoginManager(LoginManager):
|
class MyLoginManager(LoginManager):
|
||||||
def _session_protection_failed(self):
|
def _session_protection_failed(self):
|
||||||
_session = session._get_current_object()
|
sess = session._get_current_object()
|
||||||
ident = self._session_identifier_generator()
|
ident = self._session_identifier_generator()
|
||||||
if(_session and not (len(_session) == 1
|
if(sess and not (len(sess) == 1
|
||||||
and _session.get('csrf_token', None))) and ident != _session.get('_id', None):
|
and sess.get('csrf_token', None))) and ident != sess.get('_id', None):
|
||||||
return super(). _session_protection_failed()
|
return super(). _session_protection_failed()
|
||||||
return False
|
return False
|
||||||
|
|
||||||
|
def _load_user_from_remember_cookie(self, cookie):
|
||||||
|
user_id = decode_cookie(cookie)
|
||||||
|
if user_id is not None:
|
||||||
|
session["_user_id"] = user_id
|
||||||
|
session["_fresh"] = False
|
||||||
|
user = None
|
||||||
|
if self._user_callback:
|
||||||
|
user = self._user_callback(user_id)
|
||||||
|
if user is not None:
|
||||||
|
app = current_app._get_current_object()
|
||||||
|
user_loaded_from_cookie.send(app, user=user)
|
||||||
|
# if session was restored from remember me cookie make login valid
|
||||||
|
confirm_login()
|
||||||
|
return user
|
||||||
|
return None
|
||||||
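The new `_load_user_from_remember_cookie` above calls Flask-Login's `decode_cookie`, which verifies a signed remember-me cookie before trusting the user id it carries. A rough stdlib sketch of that signed-cookie idea (the secret value and `|` separator here are illustrative assumptions, not Flask-Login's exact key derivation):

```python
import hashlib
import hmac

SECRET = b"app-secret-key"  # assumption: stands in for the Flask app's secret key

def encode_cookie(user_id):
    # Sign the payload so tampering is detectable when the cookie comes back.
    digest = hmac.new(SECRET, user_id.encode(), hashlib.sha512).hexdigest()
    return "{}|{}".format(user_id, digest)

def decode_cookie(cookie):
    # Return the user id only when the signature verifies, else None.
    user_id, _, digest = cookie.rpartition("|")
    expected = hmac.new(SECRET, user_id.encode(), hashlib.sha512).hexdigest()
    return user_id if hmac.compare_digest(digest, expected) else None
```

A forged or truncated cookie fails `compare_digest` and yields `None`, which is why the handler above can safely fall through to `return None`.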
|
|
|
@@ -36,11 +36,16 @@ from .reverseproxy import ReverseProxied
|
||||||
from .server import WebServer
|
from .server import WebServer
|
||||||
from .dep_check import dependency_check
|
from .dep_check import dependency_check
|
||||||
from .updater import Updater
|
from .updater import Updater
|
||||||
from .babel import babel
|
from .babel import babel, get_locale
|
||||||
from . import config_sql
|
from . import config_sql
|
||||||
from . import cache_buster
|
from . import cache_buster
|
||||||
from . import ub, db
|
from . import ub, db
|
||||||
|
|
||||||
|
try:
|
||||||
|
from flask_limiter import Limiter
|
||||||
|
limiter_present = True
|
||||||
|
except ImportError:
|
||||||
|
limiter_present = False
|
||||||
try:
|
try:
|
||||||
from flask_wtf.csrf import CSRFProtect
|
from flask_wtf.csrf import CSRFProtect
|
||||||
wtf_present = True
|
wtf_present = True
|
||||||
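The try/except import probes above (`limiter_present`, `wtf_present`) are the standard optional-dependency pattern. The same check can be expressed with `importlib` without triggering a full import; this is a generic sketch, not code from the repository:

```python
import importlib.util

def has_module(name):
    """True when an optional dependency is importable, without importing it."""
    return importlib.util.find_spec(name) is not None

# Feature flags mirroring the limiter/CSRF probes above
limiter_present = has_module("flask_limiter")
wtf_present = has_module("flask_wtf")
```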
|
@@ -59,7 +64,8 @@ mimetypes.add_type('application/x-mobi8-ebook', '.azw3')
|
||||||
mimetypes.add_type('application/x-cbr', '.cbr')
|
mimetypes.add_type('application/x-cbr', '.cbr')
|
||||||
mimetypes.add_type('application/x-cbz', '.cbz')
|
mimetypes.add_type('application/x-cbz', '.cbz')
|
||||||
mimetypes.add_type('application/x-cbt', '.cbt')
|
mimetypes.add_type('application/x-cbt', '.cbt')
|
||||||
mimetypes.add_type('image/vnd.djvu', '.djvu')
|
mimetypes.add_type('application/x-cb7', '.cb7')
|
||||||
|
mimetypes.add_type('image/vnd.djv', '.djv')
|
||||||
mimetypes.add_type('application/mpeg', '.mpeg')
|
mimetypes.add_type('application/mpeg', '.mpeg')
|
||||||
mimetypes.add_type('application/mpeg', '.mp3')
|
mimetypes.add_type('application/mpeg', '.mp3')
|
||||||
mimetypes.add_type('application/mp4', '.m4a')
|
mimetypes.add_type('application/mp4', '.m4a')
|
||||||
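The rows above extend Python's process-wide stdlib `mimetypes` table at import time, so later `guess_type` lookups resolve the ebook and comic-archive extensions. For example:

```python
import mimetypes

# Same stdlib registrations as in cps/__init__.py
mimetypes.add_type('application/x-cbz', '.cbz')
mimetypes.add_type('image/vnd.djvu', '.djvu')

print(mimetypes.guess_type('comic.cbz')[0])   # application/x-cbz
print(mimetypes.guess_type('scan.djvu')[0])   # image/vnd.djvu
```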
|
@@ -81,10 +87,10 @@ app.config.update(
|
||||||
|
|
||||||
lm = MyLoginManager()
|
lm = MyLoginManager()
|
||||||
|
|
||||||
config = config_sql._ConfigSQL()
|
|
||||||
|
|
||||||
cli_param = CliParameter()
|
cli_param = CliParameter()
|
||||||
|
|
||||||
|
config = config_sql.ConfigSQL()
|
||||||
|
|
||||||
if wtf_present:
|
if wtf_present:
|
||||||
csrf = CSRFProtect()
|
csrf = CSRFProtect()
|
||||||
else:
|
else:
|
||||||
|
@@ -96,32 +102,28 @@ web_server = WebServer()
|
||||||
|
|
||||||
updater_thread = Updater()
|
updater_thread = Updater()
|
||||||
|
|
||||||
|
if limiter_present:
|
||||||
|
limiter = Limiter(key_func=True, headers_enabled=True, auto_check=False, swallow_errors=False)
|
||||||
|
else:
|
||||||
|
limiter = None
|
||||||
|
|
||||||
def create_app():
|
def create_app():
|
||||||
lm.login_view = 'web.login'
|
|
||||||
lm.anonymous_user = ub.Anonymous
|
|
||||||
lm.session_protection = 'strong'
|
|
||||||
|
|
||||||
if csrf:
|
if csrf:
|
||||||
csrf.init_app(app)
|
csrf.init_app(app)
|
||||||
|
|
||||||
cli_param.init()
|
cli_param.init()
|
||||||
|
|
||||||
ub.init_db(cli_param.settings_path, cli_param.user_credentials)
|
ub.init_db(cli_param.settings_path)
|
||||||
|
|
||||||
# pylint: disable=no-member
|
# pylint: disable=no-member
|
||||||
config_sql.load_configuration(config, ub.session, cli_param)
|
encrypt_key, error = config_sql.get_encryption_key(os.path.dirname(cli_param.settings_path))
|
||||||
|
|
||||||
db.CalibreDB.update_config(config)
|
config_sql.load_configuration(ub.session, encrypt_key)
|
||||||
db.CalibreDB.setup_db(config.config_calibre_dir, cli_param.settings_path)
|
config.init_config(ub.session, encrypt_key, cli_param)
|
||||||
calibre_db.init_db()
|
|
||||||
|
|
||||||
updater_thread.init_updater(config, web_server)
|
if error:
|
||||||
# Perform dry run of updater and exit afterwards
|
log.error(error)
|
||||||
if cli_param.dry_run:
|
|
||||||
updater_thread.dry_run()
|
ub.password_change(cli_param.user_credentials)
|
||||||
sys.exit(0)
|
|
||||||
updater_thread.start()
|
|
||||||
|
|
||||||
if sys.version_info < (3, 0):
|
if sys.version_info < (3, 0):
|
||||||
log.info(
|
log.info(
|
||||||
|
@@ -132,15 +134,32 @@ def create_app():
|
||||||
'please update your installation to Python3 ***')
|
'please update your installation to Python3 ***')
|
||||||
web_server.stop(True)
|
web_server.stop(True)
|
||||||
sys.exit(5)
|
sys.exit(5)
|
||||||
if not wtf_present:
|
|
||||||
log.info('*** "flask-WTF" is needed for calibre-web to run. '
|
lm.login_view = 'web.login'
|
||||||
'Please install it using pip: "pip install flask-WTF" ***')
|
lm.anonymous_user = ub.Anonymous
|
||||||
print('*** "flask-WTF" is needed for calibre-web to run. '
|
lm.session_protection = 'strong' if config.config_session == 1 else "basic"
|
||||||
'Please install it using pip: "pip install flask-WTF" ***')
|
|
||||||
web_server.stop(True)
|
db.CalibreDB.update_config(config)
|
||||||
sys.exit(7)
|
db.CalibreDB.setup_db(config.config_calibre_dir, cli_param.settings_path)
|
||||||
for res in dependency_check() + dependency_check(True):
|
calibre_db.init_db()
|
||||||
log.info('*** "{}" version does not fit the requirements. '
|
|
||||||
|
updater_thread.init_updater(config, web_server)
|
||||||
|
# Perform dry run of updater and exit afterward
|
||||||
|
if cli_param.dry_run:
|
||||||
|
updater_thread.dry_run()
|
||||||
|
sys.exit(0)
|
||||||
|
updater_thread.start()
|
||||||
|
requirements = dependency_check()
|
||||||
|
for res in requirements:
|
||||||
|
if res['found'] == "not installed":
|
||||||
|
message = ('Cannot import {name} module, it is needed to run calibre-web, '
|
||||||
|
'please install it using "pip install {name}"').format(name=res["name"])
|
||||||
|
log.info(message)
|
||||||
|
print("*** " + message + " ***")
|
||||||
|
web_server.stop(True)
|
||||||
|
sys.exit(8)
|
||||||
|
for res in requirements + dependency_check(True):
|
||||||
|
log.info('*** "{}" version does not meet the requirements. '
|
||||||
'Should: {}, Found: {}, please consider installing required version ***'
|
'Should: {}, Found: {}, please consider installing required version ***'
|
||||||
.format(res['name'],
|
.format(res['name'],
|
||||||
res['target'],
|
res['target'],
|
||||||
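`dependency_check` compares installed versions against requirements and flags missing ones as `"not installed"`, which the new loop above aborts on. A hedged stdlib sketch of that detection (the dict shape matches the `res['found']` / `res['name']` usage above; the rest is illustrative, not the module's actual code):

```python
from importlib.metadata import version, PackageNotFoundError

def dependency_check(names):
    results = []
    for name in names:
        try:
            found = version(name)  # installed distribution version string
        except PackageNotFoundError:
            found = "not installed"
        results.append({"name": name, "found": found})
    return results
```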
|
@@ -150,14 +169,16 @@ def create_app():
|
||||||
if os.environ.get('FLASK_DEBUG'):
|
if os.environ.get('FLASK_DEBUG'):
|
||||||
cache_buster.init_cache_busting(app)
|
cache_buster.init_cache_busting(app)
|
||||||
log.info('Starting Calibre Web...')
|
log.info('Starting Calibre Web...')
|
||||||
|
|
||||||
Principal(app)
|
Principal(app)
|
||||||
lm.init_app(app)
|
lm.init_app(app)
|
||||||
app.secret_key = os.getenv('SECRET_KEY', config_sql.get_flask_session_key(ub.session))
|
app.secret_key = os.getenv('SECRET_KEY', config_sql.get_flask_session_key(ub.session))
|
||||||
|
|
||||||
web_server.init_app(app, config)
|
web_server.init_app(app, config)
|
||||||
|
if hasattr(babel, "localeselector"):
|
||||||
babel.init_app(app)
|
babel.init_app(app)
|
||||||
|
babel.localeselector(get_locale)
|
||||||
|
else:
|
||||||
|
babel.init_app(app, locale_selector=get_locale)
|
||||||
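The `hasattr` branch above keeps the app working across flask-babel versions: v2 exposed a `localeselector` decorator, while v3 takes a `locale_selector` keyword on `init_app`. The shim can be sketched with stand-in classes (the stubs below are illustrative, not flask-babel itself):

```python
def init_babel(babel, app, get_locale):
    # flask-babel < 3 registered the selector via a decorator;
    # flask-babel >= 3 takes it as an init_app keyword instead.
    if hasattr(babel, "localeselector"):
        babel.init_app(app)
        babel.localeselector(get_locale)
    else:
        babel.init_app(app, locale_selector=get_locale)

class OldBabel:  # stand-in mimicking flask-babel 2.x
    def init_app(self, app):
        self.app = app
    def localeselector(self, f):
        self.selector = f

class NewBabel:  # stand-in mimicking flask-babel 3.x
    def init_app(self, app, locale_selector=None):
        self.app, self.selector = app, locale_selector
```

Either stub ends up with the same selector registered, which is the point of the compatibility check.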
|
|
||||||
from . import services
|
from . import services
|
||||||
|
|
||||||
|
@@ -165,9 +186,22 @@ def create_app():
|
||||||
services.ldap.init_app(app, config)
|
services.ldap.init_app(app, config)
|
||||||
if services.goodreads_support:
|
if services.goodreads_support:
|
||||||
services.goodreads_support.connect(config.config_goodreads_api_key,
|
services.goodreads_support.connect(config.config_goodreads_api_key,
|
||||||
config.config_goodreads_api_secret,
|
|
||||||
config.config_use_goodreads)
|
config.config_use_goodreads)
|
||||||
config.store_calibre_uuid(calibre_db, db.Library_Id)
|
config.store_calibre_uuid(calibre_db, db.Library_Id)
|
||||||
|
# Configure rate limiter
|
||||||
|
# https://limits.readthedocs.io/en/stable/storage.html
|
||||||
|
app.config.update(RATELIMIT_ENABLED=config.config_ratelimiter)
|
||||||
|
if config.config_limiter_uri != "" and not cli_param.memory_backend:
|
||||||
|
app.config.update(RATELIMIT_STORAGE_URI=config.config_limiter_uri)
|
||||||
|
if config.config_limiter_options != "":
|
||||||
|
app.config.update(RATELIMIT_STORAGE_OPTIONS=config.config_limiter_options)
|
||||||
|
try:
|
||||||
|
limiter.init_app(app)
|
||||||
|
except Exception as e:
|
||||||
|
log.error('Wrong Flask Limiter configuration, falling back to default: {}'.format(e))
|
||||||
|
app.config.update(RATELIMIT_STORAGE_URI=None)
|
||||||
|
limiter.init_app(app)
|
||||||
|
|
||||||
# Register scheduled tasks
|
# Register scheduled tasks
|
||||||
from .schedule import register_scheduled_tasks, register_startup_tasks
|
from .schedule import register_scheduled_tasks, register_startup_tasks
|
||||||
register_scheduled_tasks(config.schedule_reconnect)
|
register_scheduled_tasks(config.schedule_reconnect)
|
||||||
|
|
|
@@ -49,9 +49,9 @@ sorted_modules = OrderedDict((sorted(modules.items(), key=lambda x: x[0].casefold()))
|
||||||
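The `casefold` key in the context line above gives a case-insensitive ordering of the module list, so `APScheduler` sorts before `babel` rather than after all lowercase names. For example (module names and versions made up):

```python
from collections import OrderedDict

modules = {'Flask': '2.0.3', 'babel': '2.12', 'APScheduler': '3.10'}
sorted_modules = OrderedDict(sorted(modules.items(), key=lambda kv: kv[0].casefold()))
print(list(sorted_modules))  # ['APScheduler', 'babel', 'Flask']
```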
|
|
||||||
def collect_stats():
|
def collect_stats():
|
||||||
if constants.NIGHTLY_VERSION[0] == "$Format:%H$":
|
if constants.NIGHTLY_VERSION[0] == "$Format:%H$":
|
||||||
calibre_web_version = constants.STABLE_VERSION['version']
|
calibre_web_version = constants.STABLE_VERSION['version'].replace("b", " Beta")
|
||||||
else:
|
else:
|
||||||
calibre_web_version = (constants.STABLE_VERSION['version'] + ' - '
|
calibre_web_version = (constants.STABLE_VERSION['version'].replace("b", " Beta") + ' - '
|
||||||
+ constants.NIGHTLY_VERSION[0].replace('%', '%%') + ' - '
|
+ constants.NIGHTLY_VERSION[0].replace('%', '%%') + ' - '
|
||||||
+ constants.NIGHTLY_VERSION[1].replace('%', '%%'))
|
+ constants.NIGHTLY_VERSION[1].replace('%', '%%'))
|
||||||
|
|
||||||
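The new `.replace("b", " Beta")` turns a PEP 440-style beta suffix into a readable label. Note that `str.replace` substitutes every `b` in the string, which only works because these version strings contain no other `b`:

```python
version = "0.6.21b"  # hypothetical beta version string
print(version.replace("b", " Beta"))  # 0.6.21 Beta
```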
|
@@ -81,4 +81,4 @@ def stats():
|
||||||
categories = calibre_db.session.query(db.Tags).count()
|
categories = calibre_db.session.query(db.Tags).count()
|
||||||
series = calibre_db.session.query(db.Series).count()
|
series = calibre_db.session.query(db.Series).count()
|
||||||
return render_title_template('stats.html', bookcounter=counter, authorcounter=authors, versions=collect_stats(),
|
return render_title_template('stats.html', bookcounter=counter, authorcounter=authors, versions=collect_stats(),
|
||||||
categorycounter=categories, seriecounter=series, title=_(u"Statistics"), page="stat")
|
categorycounter=categories, seriecounter=series, title=_("Statistics"), page="stat")
|
||||||
|
|
cps/admin.py (404 changed lines)
|
@@ -22,19 +22,21 @@
|
||||||
|
|
||||||
import os
|
import os
|
||||||
import re
|
import re
|
||||||
import base64
|
|
||||||
import json
|
import json
|
||||||
import operator
|
import operator
|
||||||
import time
|
import time
|
||||||
|
import sys
|
||||||
|
import string
|
||||||
from datetime import datetime, timedelta
|
from datetime import datetime, timedelta
|
||||||
from datetime import time as datetime_time
|
from datetime import time as datetime_time
|
||||||
from functools import wraps
|
from functools import wraps
|
||||||
|
from urllib.parse import urlparse
|
||||||
|
|
||||||
from flask import Blueprint, flash, redirect, url_for, abort, request, make_response, send_from_directory, g, Response
|
from flask import Blueprint, flash, redirect, url_for, abort, request, make_response, send_from_directory, g, Response
|
||||||
from flask_login import login_required, current_user, logout_user, confirm_login
|
from markupsafe import Markup
|
||||||
|
from flask_login import login_required, current_user, logout_user
|
||||||
from flask_babel import gettext as _
|
from flask_babel import gettext as _
|
||||||
from flask_babel import get_locale, format_time, format_datetime, format_timedelta
|
from flask_babel import get_locale, format_time, format_datetime, format_timedelta
|
||||||
from flask import session as flask_session
|
from flask import session as flask_session
|
||||||
from sqlalchemy import and_
|
from sqlalchemy import and_
|
||||||
from sqlalchemy.orm.attributes import flag_modified
|
from sqlalchemy.orm.attributes import flag_modified
|
||||||
|
@@ -46,33 +48,35 @@ from . import db, calibre_db, ub, web_server, config, updater_thread, gdriveutil
|
||||||
kobo_sync_status, schedule
|
kobo_sync_status, schedule
|
||||||
from .helper import check_valid_domain, send_test_mail, reset_password, generate_password_hash, check_email, \
|
from .helper import check_valid_domain, send_test_mail, reset_password, generate_password_hash, check_email, \
|
||||||
valid_email, check_username
|
valid_email, check_username
|
||||||
|
from .embed_helper import get_calibre_binarypath
|
||||||
from .gdriveutils import is_gdrive_ready, gdrive_support
|
from .gdriveutils import is_gdrive_ready, gdrive_support
|
||||||
from .render_template import render_title_template, get_sidebar_config
|
from .render_template import render_title_template, get_sidebar_config
|
||||||
from .services.worker import WorkerThread
|
from .services.worker import WorkerThread
|
||||||
from .babel import get_available_translations, get_available_locale, get_user_locale_language
|
from .babel import get_available_translations, get_available_locale, get_user_locale_language
|
||||||
from . import debug_info
|
from . import debug_info
|
||||||
|
|
||||||
|
|
||||||
log = logger.create()
|
log = logger.create()
|
||||||
|
|
||||||
feature_support = {
|
feature_support = {
|
||||||
'ldap': bool(services.ldap),
|
'ldap': bool(services.ldap),
|
||||||
'goodreads': bool(services.goodreads_support),
|
'goodreads': bool(services.goodreads_support),
|
||||||
'kobo': bool(services.kobo),
|
'kobo': bool(services.kobo),
|
||||||
'updater': constants.UPDATER_AVAILABLE,
|
'updater': constants.UPDATER_AVAILABLE,
|
||||||
'gmail': bool(services.gmail),
|
'gmail': bool(services.gmail),
|
||||||
'scheduler': schedule.use_APScheduler,
|
'scheduler': schedule.use_APScheduler,
|
||||||
'gdrive': gdrive_support
|
'gdrive': gdrive_support
|
||||||
}
|
}
|
||||||
|
|
||||||
try:
|
try:
|
||||||
import rarfile # pylint: disable=unused-import
|
import rarfile # pylint: disable=unused-import
|
||||||
|
|
||||||
feature_support['rar'] = True
|
feature_support['rar'] = True
|
||||||
except (ImportError, SyntaxError):
|
except (ImportError, SyntaxError):
|
||||||
feature_support['rar'] = False
|
feature_support['rar'] = False
|
||||||
|
|
||||||
try:
|
try:
|
||||||
from .oauth_bb import oauth_check, oauthblueprints
|
from .oauth_bb import oauth_check, oauthblueprints
|
||||||
|
|
||||||
feature_support['oauth'] = True
|
feature_support['oauth'] = True
|
||||||
except ImportError as err:
|
except ImportError as err:
|
||||||
log.debug('Cannot import Flask-Dance, login with Oauth will not work: %s', err)
|
log.debug('Cannot import Flask-Dance, login with Oauth will not work: %s', err)
|
||||||
|
@@ -80,7 +84,6 @@ except ImportError as err:
|
||||||
oauthblueprints = []
|
oauthblueprints = []
|
||||||
oauth_check = {}
|
oauth_check = {}
|
||||||
|
|
||||||
|
|
||||||
admi = Blueprint('admin', __name__)
|
admi = Blueprint('admin', __name__)
|
||||||
|
|
||||||
|
|
||||||
|
@@ -100,25 +103,26 @@ def admin_required(f):
|
||||||
|
|
||||||
@admi.before_app_request
|
@admi.before_app_request
|
||||||
def before_request():
|
def before_request():
|
||||||
# make remember me function work
|
try:
|
||||||
if current_user.is_authenticated:
|
if not ub.check_user_session(current_user.id,
|
||||||
confirm_login()
|
flask_session.get('_id')) and 'opds' not in request.path \
|
||||||
if not ub.check_user_session(current_user.id, flask_session.get('_id')) and 'opds' not in request.path:
|
and config.config_session == 1:
|
||||||
logout_user()
|
logout_user()
|
||||||
|
except AttributeError:
|
||||||
|
pass # ? fails on requesting /ajax/emailstat during restart ?
|
||||||
g.constants = constants
|
g.constants = constants
|
||||||
g.user = current_user
|
g.google_site_verification = os.getenv('GOOGLE_SITE_VERIFICATION', '')
|
||||||
g.allow_registration = config.config_public_reg
|
g.allow_registration = config.config_public_reg
|
||||||
g.allow_anonymous = config.config_anonbrowse
|
g.allow_anonymous = config.config_anonbrowse
|
||||||
g.allow_upload = config.config_uploading
|
g.allow_upload = config.config_uploading
|
||||||
g.current_theme = config.config_theme
|
g.current_theme = config.config_theme
|
||||||
g.config_authors_max = config.config_authors_max
|
g.config_authors_max = config.config_authors_max
|
||||||
g.shelves_access = ub.session.query(ub.Shelf).filter(
|
|
||||||
or_(ub.Shelf.is_public == 1, ub.Shelf.user_id == current_user.id)).order_by(ub.Shelf.name).all()
|
|
||||||
if '/static/' not in request.path and not config.db_configured and \
|
if '/static/' not in request.path and not config.db_configured and \
|
||||||
request.endpoint not in ('admin.ajax_db_config',
|
request.endpoint not in ('admin.ajax_db_config',
|
||||||
'admin.simulatedbchange',
|
'admin.simulatedbchange',
|
||||||
'admin.db_configuration',
|
'admin.db_configuration',
|
||||||
'web.login',
|
'web.login',
|
||||||
|
'web.login_post',
|
||||||
'web.logout',
|
'web.logout',
|
||||||
'admin.load_dialogtexts',
|
'admin.load_dialogtexts',
|
||||||
'admin.ajax_pathchooser'):
|
'admin.ajax_pathchooser'):
|
||||||
|
@@ -136,28 +140,39 @@ def admin_forbidden():
|
||||||
@admin_required
|
@admin_required
|
||||||
def shutdown():
|
def shutdown():
|
||||||
task = request.get_json().get('parameter', -1)
|
task = request.get_json().get('parameter', -1)
|
||||||
showtext = {}
|
show_text = {}
|
||||||
if task in (0, 1): # valid commandos received
|
if task in (0, 1): # valid commandos received
|
||||||
# close all database connections
|
# close all database connections
|
||||||
calibre_db.dispose()
|
calibre_db.dispose()
|
||||||
ub.dispose()
|
ub.dispose()
|
||||||
|
|
||||||
if task == 0:
|
if task == 0:
|
||||||
showtext['text'] = _(u'Server restarted, please reload page')
|
show_text['text'] = _('Server restarted, please reload page.')
|
||||||
else:
|
else:
|
||||||
showtext['text'] = _(u'Performing shutdown of server, please close window')
|
show_text['text'] = _('Performing Server shutdown, please close window.')
|
||||||
# stop gevent/tornado server
|
# stop gevent/tornado server
|
||||||
web_server.stop(task == 0)
|
web_server.stop(task == 0)
|
||||||
return json.dumps(showtext)
|
return json.dumps(show_text)
|
||||||
|
|
||||||
if task == 2:
|
if task == 2:
|
||||||
log.warning("reconnecting to calibre database")
|
log.warning("reconnecting to calibre database")
|
||||||
calibre_db.reconnect_db(config, ub.app_DB_path)
|
calibre_db.reconnect_db(config, ub.app_DB_path)
|
||||||
showtext['text'] = _(u'Reconnect successful')
|
show_text['text'] = _('Success! Database Reconnected')
|
||||||
return json.dumps(showtext)
|
return json.dumps(show_text)
|
||||||
|
|
||||||
showtext['text'] = _(u'Unknown command')
|
show_text['text'] = _('Unknown command')
|
||||||
return json.dumps(showtext), 400
|
return json.dumps(show_text), 400
|
||||||
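The handler above maps a numeric `parameter` to a JSON message, returning a 400 status for unknown commands. A minimal standalone sketch of that dispatch (the message strings are copied from the diff; the plain function and 200 status are illustrative, the real handler also stops the server and reconnects the database):

```python
import json

def shutdown(task):
    if task in (0, 1):
        text = ('Server restarted, please reload page.' if task == 0
                else 'Performing Server shutdown, please close window.')
        return json.dumps({'text': text}), 200
    if task == 2:
        return json.dumps({'text': 'Success! Database Reconnected'}), 200
    return json.dumps({'text': 'Unknown command'}), 400
```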
|
|
||||||
|
|
||||||
|
@admi.route("/metadata_backup", methods=["POST"])
|
||||||
|
@login_required
|
||||||
|
@admin_required
|
||||||
|
def queue_metadata_backup():
|
||||||
|
show_text = {}
|
||||||
|
log.warning("Queuing all books for metadata backup")
|
||||||
|
helper.set_all_metadata_dirty()
|
||||||
|
show_text['text'] = _('Success! Books queued for Metadata Backup, please check Tasks for result')
|
||||||
|
return json.dumps(show_text)
|
||||||
|
|
||||||
|
|
||||||
# method is available without login and not protected by CSRF to make it easy reachable, is per default switched off
|
# method is available without login and not protected by CSRF to make it easy reachable, is per default switched off
|
||||||
|
@@ -189,32 +204,32 @@ def update_thumbnails():
|
||||||
def admin():
|
def admin():
|
||||||
version = updater_thread.get_current_version_info()
|
version = updater_thread.get_current_version_info()
|
||||||
if version is False:
|
if version is False:
|
||||||
commit = _(u'Unknown')
|
commit = _('Unknown')
|
||||||
else:
|
else:
|
||||||
if 'datetime' in version:
|
if 'datetime' in version:
|
||||||
commit = version['datetime']
|
commit = version['datetime']
|
||||||
|
|
||||||
tz = timedelta(seconds=time.timezone if (time.localtime().tm_isdst == 0) else time.altzone)
|
tz = timedelta(seconds=time.timezone if (time.localtime().tm_isdst == 0) else time.altzone)
|
||||||
form_date = datetime.strptime(commit[:19], "%Y-%m-%dT%H:%M:%S")
|
form_date = datetime.strptime(commit[:19], "%Y-%m-%dT%H:%M:%S")
|
||||||
if len(commit) > 19: # check if string has timezone
|
if len(commit) > 19: # check if string has timezone
|
||||||
if commit[19] == '+':
|
if commit[19] == '+':
|
||||||
form_date -= timedelta(hours=int(commit[20:22]), minutes=int(commit[23:]))
|
form_date -= timedelta(hours=int(commit[20:22]), minutes=int(commit[23:]))
|
||||||
elif commit[19] == '-':
|
elif commit[19] == '-':
|
||||||
form_date += timedelta(hours=int(commit[20:22]), minutes=int(commit[23:]))
|
form_date += timedelta(hours=int(commit[20:22]), minutes=int(commit[23:]))
|
||||||
commit = format_datetime(form_date - tz, format='short')
|
commit = format_datetime(form_date - tz, format='short')
|
||||||
else:
|
else:
|
||||||
commit = version['version']
|
commit = version['version'].replace("b", " Beta")
|
||||||
|
|
||||||
all_user = ub.session.query(ub.User).all()
|
all_user = ub.session.query(ub.User).all()
|
||||||
email_settings = config.get_mail_settings()
|
# email_settings = mail_config.get_mail_settings()
|
||||||
schedule_time = format_time(datetime_time(hour=config.schedule_start_time), format="short")
|
schedule_time = format_time(datetime_time(hour=config.schedule_start_time), format="short")
|
||||||
t = timedelta(hours=config.schedule_duration // 60, minutes=config.schedule_duration % 60)
|
t = timedelta(hours=config.schedule_duration // 60, minutes=config.schedule_duration % 60)
|
||||||
schedule_duration = format_timedelta(t, threshold=.99)
|
schedule_duration = format_timedelta(t, threshold=.99)
|
||||||
|
|
||||||
return render_title_template("admin.html", allUser=all_user, email=email_settings, config=config, commit=commit,
|
return render_title_template("admin.html", allUser=all_user, config=config, commit=commit,
|
||||||
feature_support=feature_support, schedule_time=schedule_time,
|
feature_support=feature_support, schedule_time=schedule_time,
|
||||||
schedule_duration=schedule_duration,
|
schedule_duration=schedule_duration,
|
||||||
title=_(u"Admin page"), page="admin")
|
title=_("Admin page"), page="admin")
|
||||||
|
|
||||||
|
|
||||||
@admi.route("/admin/dbconfig", methods=["GET", "POST"])
|
@admi.route("/admin/dbconfig", methods=["GET", "POST"])
|
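The hunk above undoes a trailing `±HH:MM` offset by hand before formatting the commit date. A standalone sketch of that behavior (not calibre-web code, just the same parsing logic in isolation):

```python
from datetime import datetime, timedelta

def to_utc(stamp):
    # Parse the first 19 chars as a naive ISO timestamp, then undo a
    # trailing "+HH:MM" / "-HH:MM" offset the same way the diff does.
    form_date = datetime.strptime(stamp[:19], "%Y-%m-%dT%H:%M:%S")
    if len(stamp) > 19:  # string carries a timezone suffix
        if stamp[19] == '+':
            form_date -= timedelta(hours=int(stamp[20:22]), minutes=int(stamp[23:]))
        elif stamp[19] == '-':
            form_date += timedelta(hours=int(stamp[20:22]), minutes=int(stamp[23:]))
    return form_date

print(to_utc("2023-05-01T12:00:00+02:00"))  # 2023-05-01 10:00:00
```

Subtracting a positive offset (and adding a negative one) yields UTC, which the route then shifts into the server's local zone via `tz`.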
@@ -234,7 +249,7 @@ def configuration():
                                  config=config,
                                  provider=oauthblueprints,
                                  feature_support=feature_support,
-                                 title=_(u"Basic Configuration"), page="config")
+                                 title=_("Basic Configuration"), page="config")


 @admi.route("/admin/ajaxconfig", methods=["POST"])
@@ -262,9 +277,9 @@ def calibreweb_alive():
 @login_required
 @admin_required
 def view_configuration():
-    read_column = calibre_db.session.query(db.CustomColumns)\
+    read_column = calibre_db.session.query(db.CustomColumns) \
         .filter(and_(db.CustomColumns.datatype == 'bool', db.CustomColumns.mark_for_delete == 0)).all()
-    restrict_columns = calibre_db.session.query(db.CustomColumns)\
+    restrict_columns = calibre_db.session.query(db.CustomColumns) \
         .filter(and_(db.CustomColumns.datatype == 'text', db.CustomColumns.mark_for_delete == 0)).all()
     languages = calibre_db.speaking_language()
     translations = get_available_locale()
@@ -272,7 +287,7 @@ def view_configuration():
                                  restrictColumns=restrict_columns,
                                  languages=languages,
                                  translations=translations,
-                                 title=_(u"UI Configuration"), page="uiconfig")
+                                 title=_("UI Configuration"), page="uiconfig")


 @admi.route("/admin/usertable")
@@ -283,11 +298,11 @@ def edit_user_table():
     languages = calibre_db.speaking_language()
     translations = get_available_locale()
     all_user = ub.session.query(ub.User)
-    tags = calibre_db.session.query(db.Tags)\
-        .join(db.books_tags_link)\
-        .join(db.Books)\
+    tags = calibre_db.session.query(db.Tags) \
+        .join(db.books_tags_link) \
+        .join(db.Books) \
         .filter(calibre_db.common_filters()) \
-        .group_by(text('books_tags_link.tag'))\
+        .group_by(text('books_tags_link.tag')) \
         .order_by(db.Tags.name).all()
     if config.config_restricted_column:
         custom_values = calibre_db.session.query(db.cc_classes[config.config_restricted_column]).all()
@@ -306,7 +321,7 @@ def edit_user_table():
                                  all_roles=constants.ALL_ROLES,
                                  kobo_support=kobo_support,
                                  sidebar_settings=constants.sidebar_settings,
-                                 title=_(u"Edit Users"),
+                                 title=_("Edit Users"),
                                  page="usertable")

@@ -464,20 +479,20 @@ def edit_list_user(param):
     elif param.endswith('role'):
         value = int(vals['field_index'])
         if user.name == "Guest" and value in \
                 [constants.ROLE_ADMIN, constants.ROLE_PASSWD, constants.ROLE_EDIT_SHELFS]:
             raise Exception(_("Guest can't have this role"))
         # check for valid value, last on checks for power of 2 value
-        if value > 0 and value <= constants.ROLE_VIEWER and (value & value-1 == 0 or value == 1):
+        if value > 0 and value <= constants.ROLE_VIEWER and (value & value - 1 == 0 or value == 1):
             if vals['value'] == 'true':
                 user.role |= value
             elif vals['value'] == 'false':
                 if value == constants.ROLE_ADMIN:
-                    if not ub.session.query(ub.User).\
+                    if not ub.session.query(ub.User). \
                             filter(ub.User.role.op('&')(constants.ROLE_ADMIN) == constants.ROLE_ADMIN,
                                    ub.User.id != user.id).count():
                         return Response(
                             json.dumps([{'type': "danger",
-                                         'message': _(u"No admin user remaining, can't remove admin role",
+                                         'message': _("No admin user remaining, can't remove admin role",
                                                       nick=user.name)}]), mimetype='application/json')
                     user.role &= ~value
                 else:
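The `value & value - 1 == 0` guard used for both role and sidebar flags accepts only values with a single bit set (Python's `&` binds looser than `-` but tighter than `==`, so it parses as `(value & (value - 1)) == 0`). A standalone sketch of the check, with `ROLE_VIEWER` stood in by a hypothetical upper bound:

```python
def is_single_flag(value, max_flag):
    # True only when exactly one bit is set and the flag is in range,
    # mirroring the role/sidebar validation in the diff.
    return 0 < value <= max_flag and (value & (value - 1) == 0 or value == 1)

MAX_FLAG = 1 << 7  # hypothetical bound for the sketch (not the real constant)
print([v for v in range(1, MAX_FLAG + 1) if is_single_flag(v, MAX_FLAG)])
# [1, 2, 4, 8, 16, 32, 64, 128]
```

Composite values such as `6` (`0b110`) are rejected, so a client can only toggle one permission bit per request.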
@@ -489,7 +504,7 @@ def edit_list_user(param):
         if user.name == "Guest" and value == constants.SIDEBAR_READ_AND_UNREAD:
             raise Exception(_("Guest can't have this view"))
         # check for valid value, last on checks for power of 2 value
-        if value > 0 and value <= constants.SIDEBAR_LIST and (value & value-1 == 0 or value == 1):
+        if value > 0 and value <= constants.SIDEBAR_LIST and (value & value - 1 == 0 or value == 1):
             if vals['value'] == 'true':
                 user.sidebar_view |= value
             elif vals['value'] == 'false':
@@ -554,13 +569,13 @@ def update_view_configuration():
     calibre_db.update_title_sort(config)

     if not check_valid_read_column(to_save.get("config_read_column", "0")):
-        flash(_(u"Invalid Read Column"), category="error")
+        flash(_("Invalid Read Column"), category="error")
         log.debug("Invalid Read column")
         return view_configuration()
     _config_int(to_save, "config_read_column")

     if not check_valid_restricted_column(to_save.get("config_restricted_column", "0")):
-        flash(_(u"Invalid Restricted Column"), category="error")
+        flash(_("Invalid Restricted Column"), category="error")
         log.debug("Invalid Restricted Column")
         return view_configuration()
     _config_int(to_save, "config_restricted_column")
@@ -580,7 +595,7 @@ def update_view_configuration():
         config.config_default_show |= constants.DETAIL_RANDOM

     config.save()
-    flash(_(u"Calibre-Web configuration updated"), category="success")
+    flash(_("Calibre-Web configuration updated"), category="success")
     log.debug("Calibre-Web configuration updated")
     before_request()

@@ -642,7 +657,7 @@ def edit_domain(allow):
 @admin_required
 def add_domain(allow):
     domain_name = request.form.to_dict()['domainname'].replace('*', '%').replace('?', '_').lower()
-    check = ub.session.query(ub.Registration).filter(ub.Registration.domain == domain_name)\
+    check = ub.session.query(ub.Registration).filter(ub.Registration.domain == domain_name) \
         .filter(ub.Registration.allow == allow).first()
     if not check:
         new_domain = ub.Registration(domain=domain_name, allow=allow)
@@ -860,16 +875,16 @@ def delete_restriction(res_type, user_id):
 @login_required
 @admin_required
 def list_restriction(res_type, user_id):
     if res_type == 0: # Tags as template
-        restrict = [{'Element': x, 'type': _('Deny'), 'id': 'd'+str(i)}
+        restrict = [{'Element': x, 'type': _('Deny'), 'id': 'd' + str(i)}
                     for i, x in enumerate(config.list_denied_tags()) if x != '']
-        allow = [{'Element': x, 'type': _('Allow'), 'id': 'a'+str(i)}
+        allow = [{'Element': x, 'type': _('Allow'), 'id': 'a' + str(i)}
                  for i, x in enumerate(config.list_allowed_tags()) if x != '']
         json_dumps = restrict + allow
     elif res_type == 1: # CustomC as template
-        restrict = [{'Element': x, 'type': _('Deny'), 'id': 'd'+str(i)}
+        restrict = [{'Element': x, 'type': _('Deny'), 'id': 'd' + str(i)}
                     for i, x in enumerate(config.list_denied_column_values()) if x != '']
-        allow = [{'Element': x, 'type': _('Allow'), 'id': 'a'+str(i)}
+        allow = [{'Element': x, 'type': _('Allow'), 'id': 'a' + str(i)}
                  for i, x in enumerate(config.list_allowed_column_values()) if x != '']
         json_dumps = restrict + allow
     elif res_type == 2: # Tags per user
@@ -877,9 +892,9 @@ def list_restriction(res_type, user_id):
         usr = ub.session.query(ub.User).filter(ub.User.id == user_id).first()
     else:
         usr = current_user
-    restrict = [{'Element': x, 'type': _('Deny'), 'id': 'd'+str(i)}
+    restrict = [{'Element': x, 'type': _('Deny'), 'id': 'd' + str(i)}
                 for i, x in enumerate(usr.list_denied_tags()) if x != '']
-    allow = [{'Element': x, 'type': _('Allow'), 'id': 'a'+str(i)}
+    allow = [{'Element': x, 'type': _('Allow'), 'id': 'a' + str(i)}
             for i, x in enumerate(usr.list_allowed_tags()) if x != '']
     json_dumps = restrict + allow
     elif res_type == 3: # CustomC per user
@@ -887,9 +902,9 @@ def list_restriction(res_type, user_id):
         usr = ub.session.query(ub.User).filter(ub.User.id == user_id).first()
     else:
         usr = current_user
-    restrict = [{'Element': x, 'type': _('Deny'), 'id': 'd'+str(i)}
+    restrict = [{'Element': x, 'type': _('Deny'), 'id': 'd' + str(i)}
                 for i, x in enumerate(usr.list_denied_column_values()) if x != '']
-    allow = [{'Element': x, 'type': _('Allow'), 'id': 'a'+str(i)}
+    allow = [{'Element': x, 'type': _('Allow'), 'id': 'a' + str(i)}
             for i, x in enumerate(usr.list_allowed_column_values()) if x != '']
     json_dumps = restrict + allow
     else:
@@ -902,11 +917,15 @@ def list_restriction(res_type, user_id):

 @admi.route("/ajax/fullsync", methods=["POST"])
 @login_required
-def ajax_fullsync():
-    count = ub.session.query(ub.KoboSyncedBooks).filter(current_user.id == ub.KoboSyncedBooks.user_id).delete()
-    message = _("{} sync entries deleted").format(count)
-    ub.session_commit(message)
-    return Response(json.dumps([{"type": "success", "message": message}]), mimetype='application/json')
+def ajax_self_fullsync():
+    return do_full_kobo_sync(current_user.id)
+
+
+@admi.route("/ajax/fullsync/<int:userid>", methods=["POST"])
+@login_required
+@admin_required
+def ajax_fullsync(userid):
+    return do_full_kobo_sync(userid)


 @admi.route("/ajax/pathchooser/")
@@ -916,10 +935,17 @@ def ajax_pathchooser():
     return pathchooser()


+def do_full_kobo_sync(userid):
+    count = ub.session.query(ub.KoboSyncedBooks).filter(userid == ub.KoboSyncedBooks.user_id).delete()
+    message = _("{} sync entries deleted").format(count)
+    ub.session_commit(message)
+    return Response(json.dumps([{"type": "success", "message": message}]), mimetype='application/json')
+
+
 def check_valid_read_column(column):
     if column != "0":
         if not calibre_db.session.query(db.CustomColumns).filter(db.CustomColumns.id == column) \
                 .filter(and_(db.CustomColumns.datatype == 'bool', db.CustomColumns.mark_for_delete == 0)).all():
             return False
     return True

@@ -927,7 +953,7 @@ def check_valid_read_column(column):
 def check_valid_restricted_column(column):
     if column != "0":
         if not calibre_db.session.query(db.CustomColumns).filter(db.CustomColumns.id == column) \
                 .filter(and_(db.CustomColumns.datatype == 'text', db.CustomColumns.mark_for_delete == 0)).all():
             return False
     return True

@@ -955,7 +981,7 @@ def prepare_tags(user, action, tags_name, id_list):
             raise Exception(_("Tag not found"))
         new_tags_list = [x.name for x in tags]
     else:
-        tags = calibre_db.session.query(db.cc_classes[config.config_restricted_column])\
+        tags = calibre_db.session.query(db.cc_classes[config.config_restricted_column]) \
             .filter(db.cc_classes[config.config_restricted_column].id.in_(id_list)).all()
         new_tags_list = [x.value for x in tags]
     saved_tags_list = user.__dict__[tags_name].split(",") if len(user.__dict__[tags_name]) else []
@@ -968,6 +994,19 @@ def prepare_tags(user, action, tags_name, id_list):
     return ",".join(saved_tags_list)


+def get_drives(current):
+    drive_letters = []
+    for d in string.ascii_uppercase:
+        if os.path.exists('{}:'.format(d)) and current[0].lower() != d.lower():
+            drive = "{}:\\".format(d)
+            data = {"name": drive, "fullpath": drive}
+            data["sort"] = "_" + data["fullpath"].lower()
+            data["type"] = "dir"
+            data["size"] = ""
+            drive_letters.append(data)
+    return drive_letters
+
+
 def pathchooser():
     browse_for = "folder"
     folder_only = request.args.get('folder', False) == "true"
@@ -975,43 +1014,45 @@ def pathchooser():
     path = os.path.normpath(request.args.get('path', ""))

     if os.path.isfile(path):
-        oldfile = path
+        old_file = path
         path = os.path.dirname(path)
     else:
-        oldfile = ""
+        old_file = ""

     absolute = False

     if os.path.isdir(path):
-        # if os.path.isabs(path):
         cwd = os.path.realpath(path)
         absolute = True
-        # else:
-        # cwd = os.path.relpath(path)
     else:
         cwd = os.getcwd()

     cwd = os.path.normpath(os.path.realpath(cwd))
-    parentdir = os.path.dirname(cwd)
+    parent_dir = os.path.dirname(cwd)
     if not absolute:
         if os.path.realpath(cwd) == os.path.realpath("/"):
             cwd = os.path.relpath(cwd)
         else:
             cwd = os.path.relpath(cwd) + os.path.sep
-        parentdir = os.path.relpath(parentdir) + os.path.sep
+        parent_dir = os.path.relpath(parent_dir) + os.path.sep

-    if os.path.realpath(cwd) == os.path.realpath("/"):
-        parentdir = ""
+    files = []
+    if os.path.realpath(cwd) == os.path.realpath("/") \
+            or (sys.platform == "win32" and os.path.realpath(cwd)[1:] == os.path.realpath("/")[1:]):
+        # we are in root
+        parent_dir = ""
+        if sys.platform == "win32":
+            files = get_drives(cwd)

     try:
         folders = os.listdir(cwd)
     except Exception:
         folders = []

-    files = []
     for f in folders:
         try:
-            data = {"name": f, "fullpath": os.path.join(cwd, f)}
+            sanitized_f = str(Markup.escape(f))
+            data = {"name": sanitized_f, "fullpath": os.path.join(cwd, sanitized_f)}
             data["sort"] = data["fullpath"].lower()
         except Exception:
             continue
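The new `str(Markup.escape(f))` line HTML-escapes each directory entry before it is echoed back to the path-chooser UI as JSON. As a standalone illustration of the same escaping (using the stdlib `html.escape` in place of MarkupSafe's `Markup.escape`, which behaves equivalently for these characters):

```python
import html

def sanitize_name(f):
    # HTML-escape a filename before echoing it into JSON for the UI,
    # analogous to str(Markup.escape(f)) in the diff.
    return html.escape(f)

print(sanitize_name('<script>x</script>.epub'))
# &lt;script&gt;x&lt;/script&gt;.epub
```

This prevents a hostile filename from being rendered as live markup when the chooser dialog displays the listing.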
@@ -1041,9 +1082,9 @@ def pathchooser():
     context = {
         "cwd": cwd,
         "files": files,
-        "parentdir": parentdir,
+        "parentdir": parent_dir,
         "type": browse_for,
-        "oldfile": oldfile,
+        "oldfile": old_file,
         "absolute": absolute,
     }
     return json.dumps(context)
@@ -1062,7 +1103,7 @@ def _config_checkbox_int(to_save, x):


 def _config_string(to_save, x):
-    return config.set_from_dictionary(to_save, x, lambda y: y.strip() if y else y)
+    return config.set_from_dictionary(to_save, x, lambda y: y.strip().strip(u'\u200B\u200C\u200D\ufeff') if y else y)


 def _configuration_gdrive_helper(to_save):
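The extended lambda strips ordinary whitespace first, then zero-width spaces/joiners and a BOM (`U+200B`–`U+200D`, `U+FEFF`) that often survive copy-paste from web pages. The same cleanup in isolation (a sketch, not the project's config helper):

```python
def clean_setting(y):
    # Mirror the diff's lambda: trim whitespace, then zero-width
    # characters that str.strip() alone would leave in place.
    return y.strip().strip(u'\u200B\u200C\u200D\ufeff') if y else y

print(repr(clean_setting('  \u200Bldap.example.org\ufeff ')))  # 'ldap.example.org'
```

The order matters: stripping whitespace first exposes zero-width characters sitting just inside the padding so the second `strip` can remove them.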
@@ -1081,10 +1122,10 @@ def _configuration_gdrive_helper(to_save):
     if not gdrive_secrets:
         return _configuration_result(_('client_secrets.json Is Not Configured For Web Application'))
     gdriveutils.update_settings(
         gdrive_secrets['client_id'],
         gdrive_secrets['client_secret'],
         gdrive_secrets['redirect_uris'][0]
     )

     # always show Google Drive settings, but in case of error deny support
     new_gdrive_value = (not gdrive_error) and ("config_use_google_drive" in to_save)
@@ -1101,12 +1142,12 @@ def _configuration_oauth_helper(to_save):
     reboot_required = False
     for element in oauthblueprints:
         if to_save["config_" + str(element['id']) + "_oauth_client_id"] != element['oauth_client_id'] \
                 or to_save["config_" + str(element['id']) + "_oauth_client_secret"] != element['oauth_client_secret']:
             reboot_required = True
             element['oauth_client_id'] = to_save["config_" + str(element['id']) + "_oauth_client_id"]
             element['oauth_client_secret'] = to_save["config_" + str(element['id']) + "_oauth_client_secret"]
         if to_save["config_" + str(element['id']) + "_oauth_client_id"] \
                 and to_save["config_" + str(element['id']) + "_oauth_client_secret"]:
             active_oauths += 1
             element["active"] = 1
         else:
@@ -1136,7 +1177,6 @@ def _configuration_logfile_helper(to_save):

 def _configuration_ldap_helper(to_save):
     reboot_required = False
-    reboot_required |= _config_string(to_save, "config_ldap_provider_url")
     reboot_required |= _config_int(to_save, "config_ldap_port")
     reboot_required |= _config_int(to_save, "config_ldap_authentication")
     reboot_required |= _config_string(to_save, "config_ldap_dn")
@@ -1151,21 +1191,26 @@ def _configuration_ldap_helper(to_save):
     reboot_required |= _config_string(to_save, "config_ldap_cert_path")
     reboot_required |= _config_string(to_save, "config_ldap_key_path")
     _config_string(to_save, "config_ldap_group_name")
-    if to_save.get("config_ldap_serv_password", "") != "":
+
+    address = urlparse(to_save.get("config_ldap_provider_url", ""))
+    to_save["config_ldap_provider_url"] = (address.hostname or address.path).strip("/")
+    reboot_required |= _config_string(to_save, "config_ldap_provider_url")
+
+    if to_save.get("config_ldap_serv_password_e", "") != "":
         reboot_required |= 1
-        config.set_from_dictionary(to_save, "config_ldap_serv_password", base64.b64encode, encode='UTF-8')
+        config.set_from_dictionary(to_save, "config_ldap_serv_password_e")
     config.save()

     if not config.config_ldap_provider_url \
             or not config.config_ldap_port \
             or not config.config_ldap_dn \
             or not config.config_ldap_user_object:
         return reboot_required, _configuration_result(_('Please Enter a LDAP Provider, '
                                                         'Port, DN and User Object Identifier'))

     if config.config_ldap_authentication > constants.LDAP_AUTH_ANONYMOUS:
         if config.config_ldap_authentication > constants.LDAP_AUTH_UNAUTHENTICATE:
-            if not config.config_ldap_serv_username or not bool(config.config_ldap_serv_password):
+            if not config.config_ldap_serv_username or not bool(config.config_ldap_serv_password_e):
                 return reboot_required, _configuration_result(_('Please Enter a LDAP Service Account and Password'))
         else:
             if not config.config_ldap_serv_username:
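The new `urlparse` lines normalize whatever the admin typed into the provider field: for a full URL such as `ldap://host:389/`, `urlparse` exposes the host via `.hostname`, while a bare hostname with no scheme lands in `.path` instead. A standalone sketch of that fallback:

```python
from urllib.parse import urlparse

def normalize_provider(url):
    # Accept both "ldap://host:389/" and a bare "host", as the diff does:
    # urlparse puts a scheme-less value in .path rather than .hostname.
    address = urlparse(url)
    return (address.hostname or address.path).strip("/")

print(normalize_provider("ldap://ldap.example.org:389/"))  # ldap.example.org
print(normalize_provider("ldap.example.org"))              # ldap.example.org
```

Either way the stored setting ends up as a plain hostname, so the LDAP client no longer chokes on pasted URLs.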
@@ -1229,16 +1274,16 @@ def new_user():
     content.default_language = config.config_default_language
     return render_title_template("user_edit.html", new_user=1, content=content,
                                  config=config, translations=translations,
-                                 languages=languages, title=_(u"Add new user"), page="newuser",
+                                 languages=languages, title=_("Add New User"), page="newuser",
                                  kobo_support=kobo_support, registered_oauth=oauth_check)


-@admi.route("/admin/mailsettings")
+@admi.route("/admin/mailsettings", methods=["GET"])
 @login_required
 @admin_required
 def edit_mailsettings():
     content = config.get_mail_settings()
-    return render_title_template("email_edit.html", content=content, title=_(u"Edit E-mail Server Settings"),
+    return render_title_template("email_edit.html", content=content, title=_("Edit Email Server Settings"),
                                  page="mailset", feature_support=feature_support)

@@ -1257,7 +1302,7 @@ def update_mailsettings():
     elif to_save.get("gmail"):
         try:
             config.mail_gmail_token = services.gmail.setup_gmail(config.mail_gmail_token)
-            flash(_(u"Gmail Account Verification Successful"), category="success")
+            flash(_("Success! Gmail Account Verified."), category="success")
         except Exception as ex:
             flash(str(ex), category="error")
             log.error(ex)
@@ -1266,8 +1311,9 @@ def update_mailsettings():
     else:
         _config_int(to_save, "mail_port")
         _config_int(to_save, "mail_use_ssl")
-        _config_string(to_save, "mail_password")
-        _config_int(to_save, "mail_size", lambda y: int(y)*1024*1024)
+        if to_save.get("mail_password_e", ""):
+            _config_string(to_save, "mail_password_e")
+        _config_int(to_save, "mail_size", lambda y: int(y) * 1024 * 1024)
         config.mail_server = to_save.get('mail_server', "").strip()
         config.mail_from = to_save.get('mail_from', "").strip()
         config.mail_login = to_save.get('mail_login', "").strip()
||||||
|
@ -1276,24 +1322,24 @@ def update_mailsettings():
|
||||||
except (OperationalError, InvalidRequestError) as e:
|
except (OperationalError, InvalidRequestError) as e:
|
||||||
ub.session.rollback()
|
ub.session.rollback()
|
||||||
log.error_or_exception("Settings Database error: {}".format(e))
|
log.error_or_exception("Settings Database error: {}".format(e))
|
||||||
flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
|
flash(_("Oops! Database Error: %(error)s.", error=e.orig), category="error")
|
||||||
return edit_mailsettings()
|
return edit_mailsettings()
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
|
flash(_("Oops! Database Error: %(error)s.", error=e.orig), category="error")
|
||||||
return edit_mailsettings()
|
return edit_mailsettings()
|
||||||
|
|
||||||
if to_save.get("test"):
|
if to_save.get("test"):
|
||||||
if current_user.email:
|
if current_user.email:
|
||||||
result = send_test_mail(current_user.email, current_user.name)
|
result = send_test_mail(current_user.email, current_user.name)
|
||||||
if result is None:
|
if result is None:
|
||||||
flash(_(u"Test e-mail queued for sending to %(email)s, please check Tasks for result",
|
flash(_("Test e-mail queued for sending to %(email)s, please check Tasks for result",
|
||||||
email=current_user.email), category="info")
|
email=current_user.email), category="info")
|
||||||
else:
|
else:
|
||||||
flash(_(u"There was an error sending the Test e-mail: %(res)s", res=result), category="error")
|
flash(_("There was an error sending the Test e-mail: %(res)s", res=result), category="error")
|
||||||
else:
|
else:
|
||||||
flash(_(u"Please configure your e-mail address first..."), category="error")
|
flash(_("Please configure your e-mail address first..."), category="error")
|
||||||
else:
|
else:
|
||||||
flash(_(u"E-mail server settings updated"), category="success")
|
flash(_("Email Server Settings updated"), category="success")
|
||||||
|
|
||||||
return edit_mailsettings()
|
return edit_mailsettings()
|
||||||
|
|
||||||
|
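The `mail_size` line in the hunks above stores a megabyte value from the form as bytes via a cast lambda. A minimal standalone sketch of that pattern — `config_int` here is a hypothetical stand-in for Calibre-Web's `_config_int`, whose real implementation is not shown in this diff:

```python
def config_int(form, key, cast=int):
    # Hypothetical stand-in for _config_int: read a form field and cast it.
    raw = form.get(key)
    return cast(raw) if raw is not None else None

# The form submits the size limit in megabytes; store it in bytes.
form = {"mail_size": "25"}
size_bytes = config_int(form, "mail_size", lambda y: int(y) * 1024 * 1024)
```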
@@ -1307,16 +1353,16 @@ def edit_scheduledtasks():
     duration_field = list()

     for n in range(24):
-        time_field.append((n, format_time(datetime_time(hour=n), format="short",)))
+        time_field.append((n, format_time(datetime_time(hour=n), format="short", )))
     for n in range(5, 65, 5):
         t = timedelta(hours=n // 60, minutes=n % 60)
-        duration_field.append((n, format_timedelta(t, threshold=.9)))
+        duration_field.append((n, format_timedelta(t, threshold=.97)))

     return render_title_template("schedule_edit.html",
                                  config=content,
                                  starttime=time_field,
                                  duration=duration_field,
-                                 title=_(u"Edit Scheduled Tasks Settings"))
+                                 title=_("Edit Scheduled Tasks Settings"))


 @admi.route("/admin/scheduledtasks", methods=["POST"])
@@ -1326,23 +1372,24 @@ def update_scheduledtasks():
     error = False
     to_save = request.form.to_dict()
     if 0 <= int(to_save.get("schedule_start_time")) <= 23:
-        _config_int(to_save, "schedule_start_time")
+        _config_int( to_save, "schedule_start_time")
     else:
-        flash(_(u"Invalid start time for task specified"), category="error")
+        flash(_("Invalid start time for task specified"), category="error")
         error = True
     if 0 < int(to_save.get("schedule_duration")) <= 60:
         _config_int(to_save, "schedule_duration")
     else:
-        flash(_(u"Invalid duration for task specified"), category="error")
+        flash(_("Invalid duration for task specified"), category="error")
         error = True
     _config_checkbox(to_save, "schedule_generate_book_covers")
     _config_checkbox(to_save, "schedule_generate_series_covers")
+    _config_checkbox(to_save, "schedule_metadata_backup")
     _config_checkbox(to_save, "schedule_reconnect")

     if not error:
         try:
             config.save()
-            flash(_(u"Scheduled tasks settings updated"), category="success")
+            flash(_("Scheduled tasks settings updated"), category="success")

             # Cancel any running tasks
             schedule.end_scheduled_tasks()
@@ -1352,7 +1399,7 @@ def update_scheduledtasks():
         except IntegrityError:
             ub.session.rollback()
             log.error("An unknown error occurred while saving scheduled tasks settings")
-            flash(_(u"An unknown error occurred. Please try again later."), category="error")
+            flash(_("Oops! An unknown error occurred. Please try again later."), category="error")
         except OperationalError:
             ub.session.rollback()
             log.error("Settings DB is not Writeable")
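The two range checks in the hunk above gate the scheduler settings: the start hour must fall in 0-23 and the duration in 5-60 minutes. The same bounds as a standalone validator (the function name is illustrative, not from the codebase):

```python
def validate_schedule(start_hour, duration_minutes):
    # Same bounds the handler enforces: start hour 0-23, duration up to 60 minutes.
    if not 0 <= start_hour <= 23:
        raise ValueError("Invalid start time for task specified")
    if not 0 < duration_minutes <= 60:
        raise ValueError("Invalid duration for task specified")
    return start_hour, duration_minutes
```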
@@ -1367,7 +1414,7 @@ def update_scheduledtasks():
 def edit_user(user_id):
     content = ub.session.query(ub.User).filter(ub.User.id == int(user_id)).first()  # type: ub.User
     if not content or (not config.config_anonbrowse and content.name == "Guest"):
-        flash(_(u"User not found"), category="error")
+        flash(_("User not found"), category="error")
         return redirect(url_for('admin.admin'))
     languages = calibre_db.speaking_language(return_all_languages=True)
     translations = get_available_locale()
@@ -1386,7 +1433,7 @@ def edit_user(user_id):
                                  registered_oauth=oauth_check,
                                  mail_configured=config.get_mail_server_configured(),
                                  kobo_support=kobo_support,
-                                 title=_(u"Edit User %(nick)s", nick=content.name),
+                                 title=_("Edit User %(nick)s", nick=content.name),
                                  page="edituser")


@@ -1397,14 +1444,14 @@ def reset_user_password(user_id):
     if current_user is not None and current_user.is_authenticated:
         ret, message = reset_password(user_id)
         if ret == 1:
-            log.debug(u"Password for user %s reset", message)
-            flash(_(u"Password for user %(user)s reset", user=message), category="success")
+            log.debug("Password for user %s reset", message)
+            flash(_("Success! Password for user %(user)s reset", user=message), category="success")
         elif ret == 0:
-            log.error(u"An unknown error occurred. Please try again later.")
-            flash(_(u"An unknown error occurred. Please try again later."), category="error")
+            log.error("An unknown error occurred. Please try again later.")
+            flash(_("Oops! An unknown error occurred. Please try again later."), category="error")
         else:
-            log.error(u"Please configure the SMTP mail settings first...")
-            flash(_(u"Please configure the SMTP mail settings first..."), category="error")
+            log.error("Please configure the SMTP mail settings.")
+            flash(_("Oops! Please configure the SMTP mail settings."), category="error")
     return redirect(url_for('admin.admin'))

@@ -1415,7 +1462,7 @@ def view_logfile():
     logfiles = {0: logger.get_logfile(config.config_logfile),
                 1: logger.get_accesslogfile(config.config_access_logfile)}
     return render_title_template("logviewer.html",
-                                 title=_(u"Logfile viewer"),
+                                 title=_("Logfile viewer"),
                                  accesslog_enable=config.config_access_log,
                                  log_enable=bool(config.config_logfile != logger.LOG_TO_STDOUT),
                                  logfiles=logfiles,
@@ -1465,7 +1512,7 @@ def download_debug():
 @admin_required
 def get_update_status():
     if feature_support['updater']:
-        log.info(u"Update status requested")
+        log.info("Update status requested")
         return updater_thread.get_available_updates(request.method)
     else:
         return ''
@@ -1558,7 +1605,7 @@ def ldap_import_create_user(user, user_data):
     ub.session.add(content)
     try:
         ub.session.commit()
         return 1, None  # increase no of users
     except Exception as ex:
         log.warning("Failed to create LDAP user: %s - %s", user, ex)
         ub.session.rollback()
@@ -1584,7 +1631,10 @@ def import_ldap_users():

     imported = 0
     for username in new_users:
-        user = username.decode('utf-8')
+        if isinstance(username, bytes):
+            user = username.decode('utf-8')
+        else:
+            user = username
         if '=' in user:
             # if member object field is empty take user object as filter
             if config.config_ldap_member_user_object:
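The `isinstance(username, bytes)` branch added above guards against LDAP client libraries returning search results as either bytes or str. The same normalization as a standalone helper (the name `to_text` is illustrative, not from the codebase):

```python
def to_text(value, encoding="utf-8"):
    # LDAP results may arrive as bytes (e.g. from python-ldap) or str; normalize to str.
    if isinstance(value, bytes):
        return value.decode(encoding)
    return value
```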
@@ -1660,7 +1710,7 @@ def _db_configuration_update_helper():
     except (OperationalError, InvalidRequestError) as e:
         ub.session.rollback()
         log.error_or_exception("Settings Database error: {}".format(e))
-        _db_configuration_result(_(u"Database error: %(error)s.", error=e.orig), gdrive_error)
+        _db_configuration_result(_("Oops! Database Error: %(error)s.", error=e.orig), gdrive_error)
     try:
         metadata_db = os.path.join(to_save['config_calibre_dir'], "metadata.db")
         if config.config_use_google_drive and is_gdrive_ready() and not os.path.exists(metadata_db):
@@ -1670,7 +1720,7 @@ def _db_configuration_update_helper():
             return _db_configuration_result('{}'.format(ex), gdrive_error)

     if db_change or not db_valid or not config.db_configured \
             or config.config_calibre_dir != to_save["config_calibre_dir"]:
         if not os.path.exists(metadata_db) or not to_save['config_calibre_dir']:
             return _db_configuration_result(_('DB Location is not Valid, Please Enter Correct Path'), gdrive_error)
         else:
@@ -1692,7 +1742,10 @@ def _db_configuration_update_helper():
     _config_string(to_save, "config_calibre_dir")
     calibre_db.update_config(config)
     if not os.access(os.path.join(config.config_calibre_dir, "metadata.db"), os.W_OK):
-        flash(_(u"DB is not Writeable"), category="warning")
+        flash(_("DB is not Writeable"), category="warning")
+    _config_string(to_save, "config_calibre_split_dir")
+    config.config_calibre_split = to_save.get('config_calibre_split', 0) == "on"
+    calibre_db.update_config(config)
     config.save()
     return _db_configuration_result(None, gdrive_error)

@@ -1713,6 +1766,7 @@ def _configuration_update_helper():

     _config_checkbox_int(to_save, "config_uploading")
     _config_checkbox_int(to_save, "config_unicode_filename")
+    _config_checkbox_int(to_save, "config_embed_metadata")
     # Reboot on config_anonbrowse with enabled ldap, as decoraters are changed in this case
     reboot_required |= (_config_checkbox_int(to_save, "config_anonbrowse")
                         and config.config_login_type == constants.LOGIN_LDAP)
@@ -1729,8 +1783,14 @@ def _configuration_update_helper():
         constants.EXTENSIONS_UPLOAD = config.config_upload_formats.split(',')

     _config_string(to_save, "config_calibre")
-    _config_string(to_save, "config_converterpath")
+    _config_string(to_save, "config_binariesdir")
     _config_string(to_save, "config_kepubifypath")
+    if "config_binariesdir" in to_save:
+        calibre_status = helper.check_calibre(config.config_binariesdir)
+        if calibre_status:
+            return _configuration_result(calibre_status)
+        to_save["config_converterpath"] = get_calibre_binarypath("ebook-convert")
+        _config_string(to_save, "config_converterpath")

     reboot_required |= _config_int(to_save, "config_login_type")

@@ -1749,10 +1809,8 @@ def _configuration_update_helper():
     # Goodreads configuration
     _config_checkbox(to_save, "config_use_goodreads")
     _config_string(to_save, "config_goodreads_api_key")
-    _config_string(to_save, "config_goodreads_api_secret")
     if services.goodreads_support:
         services.goodreads_support.connect(config.config_goodreads_api_key,
-                                           config.config_goodreads_api_secret,
                                            config.config_use_goodreads)

     _config_int(to_save, "config_updatechannel")
@@ -1765,10 +1823,28 @@ def _configuration_update_helper():
     if config.config_login_type == constants.LOGIN_OAUTH:
         reboot_required |= _configuration_oauth_helper(to_save)

+    # logfile configuration
     reboot, message = _configuration_logfile_helper(to_save)
     if message:
         return message
     reboot_required |= reboot

+    # security configuration
+    _config_checkbox(to_save, "config_password_policy")
+    _config_checkbox(to_save, "config_password_number")
+    _config_checkbox(to_save, "config_password_lower")
+    _config_checkbox(to_save, "config_password_upper")
+    _config_checkbox(to_save, "config_password_character")
+    _config_checkbox(to_save, "config_password_special")
+    if 0 < int(to_save.get("config_password_min_length", "0")) < 41:
+        _config_int(to_save, "config_password_min_length")
+    else:
+        return _configuration_result(_('Password length has to be between 1 and 40'))
+    reboot_required |= _config_int(to_save, "config_session")
+    reboot_required |= _config_checkbox(to_save, "config_ratelimiter")
+    reboot_required |= _config_string(to_save, "config_limiter_uri")
+    reboot_required |= _config_string(to_save, "config_limiter_options")

     # Rarfile Content configuration
     _config_string(to_save, "config_rarfile_location")
     if "config_rarfile_location" in to_save:
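The new `config_password_*` flags above describe a configurable password policy; the actual check lives in Calibre-Web's `helper.valid_password`, which this diff only references. A hypothetical standalone sketch of the kind of policy those flags express, including the 1-40 bound the hunk enforces on the minimum length:

```python
import string

def check_password_policy(password, min_length=8, require_number=True,
                          require_lower=True, require_upper=True,
                          require_special=True):
    # Hypothetical sketch mirroring the config_password_* flags; not the
    # helper.valid_password implementation, which is not shown in this diff.
    if not 0 < min_length < 41:  # same bounds as config_password_min_length
        raise ValueError("Password length has to be between 1 and 40")
    if len(password) < min_length:
        return False
    if require_number and not any(c.isdigit() for c in password):
        return False
    if require_lower and not any(c.islower() for c in password):
        return False
    if require_upper and not any(c.isupper() for c in password):
        return False
    if require_special and not any(c in string.punctuation for c in password):
        return False
    return True
```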
@@ -1778,7 +1854,7 @@ def _configuration_update_helper():
     except (OperationalError, InvalidRequestError) as e:
         ub.session.rollback()
         log.error_or_exception("Settings Database error: {}".format(e))
-        _configuration_result(_(u"Database error: %(error)s.", error=e.orig))
+        _configuration_result(_("Oops! Database Error: %(error)s.", error=e.orig))

     config.save()
     if reboot_required:
@@ -1794,7 +1870,7 @@ def _configuration_result(error_flash=None, reboot=False):
         config.load()
         resp['result'] = [{'type': "danger", 'message': error_flash}]
     else:
-        resp['result'] = [{'type': "success", 'message': _(u"Calibre-Web configuration updated")}]
+        resp['result'] = [{'type': "success", 'message': _("Calibre-Web configuration updated")}]
     resp['reboot'] = reboot
     resp['config_upload'] = config.config_upload_formats
     return Response(json.dumps(resp), mimetype='application/json')
@@ -1825,7 +1901,7 @@ def _db_configuration_result(error_flash=None, gdrive_error=None):
                                  gdriveError=gdrive_error,
                                  gdrivefolders=gdrivefolders,
                                  feature_support=feature_support,
-                                 title=_(u"Database Configuration"), page="dbconfig")
+                                 title=_("Database Configuration"), page="dbconfig")


 def _handle_new_user(to_save, content, languages, translations, kobo_support):
@@ -1837,11 +1913,11 @@ def _handle_new_user(to_save, content, languages, translations, kobo_support):
         content.sidebar_view |= constants.DETAIL_RANDOM

     content.role = constants.selected_roles(to_save)
-    content.password = generate_password_hash(to_save["password"])
     try:
         if not to_save["name"] or not to_save["email"] or not to_save["password"]:
             log.info("Missing entries on new user")
-            raise Exception(_(u"Please fill out all fields!"))
+            raise Exception(_("Oops! Please complete all fields."))
+        content.password = generate_password_hash(helper.valid_password(to_save.get("password", "")))
         content.email = check_email(to_save["email"])
         # Query username, if not existing, change
         content.name = check_username(to_save["name"])
@@ -1849,13 +1925,13 @@ def _handle_new_user(to_save, content, languages, translations, kobo_support):
             content.kindle_mail = valid_email(to_save["kindle_mail"])
         if config.config_public_reg and not check_valid_domain(content.email):
             log.info("E-mail: {} for new user is not from valid domain".format(content.email))
-            raise Exception(_(u"E-mail is not from valid domain"))
+            raise Exception(_("E-mail is not from valid domain"))
     except Exception as ex:
         flash(str(ex), category="error")
         return render_title_template("user_edit.html", new_user=1, content=content,
                                      config=config,
                                      translations=translations,
-                                     languages=languages, title=_(u"Add new user"), page="newuser",
+                                     languages=languages, title=_("Add new user"), page="newuser",
                                      kobo_support=kobo_support, registered_oauth=oauth_check)
     try:
         content.allowed_tags = config.config_allowed_tags
@@ -1866,17 +1942,17 @@ def _handle_new_user(to_save, content, languages, translations, kobo_support):
         content.kobo_only_shelves_sync = to_save.get("kobo_only_shelves_sync", 0) == "on"
         ub.session.add(content)
         ub.session.commit()
-        flash(_(u"User '%(user)s' created", user=content.name), category="success")
+        flash(_("User '%(user)s' created", user=content.name), category="success")
         log.debug("User {} created".format(content.name))
         return redirect(url_for('admin.admin'))
     except IntegrityError:
         ub.session.rollback()
         log.error("Found an existing account for {} or {}".format(content.name, content.email))
-        flash(_("Found an existing account for this e-mail address or name."), category="error")
+        flash(_("Oops! An account already exists for this Email. or name."), category="error")
     except OperationalError as e:
         ub.session.rollback()
         log.error_or_exception("Settings Database error: {}".format(e))
-        flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
+        flash(_("Oops! Database Error: %(error)s.", error=e.orig), category="error")


 def _delete_user(content):
@@ -1904,10 +1980,10 @@ def _delete_user(content):
             log.info("User {} deleted".format(content.name))
             return _("User '%(nick)s' deleted", nick=content.name)
         else:
-            log.warning(_("Can't delete Guest User"))
+            # log.warning(_("Can't delete Guest User"))
             raise Exception(_("Can't delete Guest User"))
     else:
-        log.warning("No admin user remaining, can't delete user")
+        # log.warning("No admin user remaining, can't delete user")
         raise Exception(_("No admin user remaining, can't delete user"))

@@ -1925,14 +2001,6 @@ def _handle_edit_user(to_save, content, languages, translations, kobo_support):
         log.warning("No admin user remaining, can't remove admin role from {}".format(content.name))
         flash(_("No admin user remaining, can't remove admin role"), category="error")
         return redirect(url_for('admin.admin'))
-    if to_save.get("password"):
-        content.password = generate_password_hash(to_save["password"])
-    anonymous = content.is_anonymous
-    content.role = constants.selected_roles(to_save)
-    if anonymous:
-        content.role |= constants.ROLE_ANONYMOUS
-    else:
-        content.role &= ~constants.ROLE_ANONYMOUS

     val = [int(k[5:]) for k in to_save if k.startswith('show_')]
     sidebar, __ = get_sidebar_config()
@@ -1960,8 +2028,20 @@ def _handle_edit_user(to_save, content, languages, translations, kobo_support):
     if to_save.get("locale"):
         content.locale = to_save["locale"]
     try:
-        if to_save.get("email", content.email) != content.email:
-            content.email = check_email(to_save["email"])
+        anonymous = content.is_anonymous
+        content.role = constants.selected_roles(to_save)
+        if anonymous:
+            content.role |= constants.ROLE_ANONYMOUS
+        else:
+            content.role &= ~constants.ROLE_ANONYMOUS
+        if to_save.get("password", ""):
+            content.password = generate_password_hash(helper.valid_password(to_save.get("password", "")))
+
+        new_email = valid_email(to_save.get("email", content.email))
+        if not new_email:
+            raise Exception(_("Email can't be empty and has to be a valid Email"))
+        if new_email != content.email:
+            content.email = check_email(new_email)
         # Query username, if not existing, change
         if to_save.get("name", content.name) != content.name:
             if to_save.get("name") == "Guest":
@@ -1981,19 +2061,19 @@ def _handle_edit_user(to_save, content, languages, translations, kobo_support):
                                      content=content,
                                      config=config,
                                      registered_oauth=oauth_check,
-                                     title=_(u"Edit User %(nick)s", nick=content.name),
+                                     title=_("Edit User %(nick)s", nick=content.name),
                                      page="edituser")
     try:
         ub.session_commit()
-        flash(_(u"User '%(nick)s' updated", nick=content.name), category="success")
+        flash(_("User '%(nick)s' updated", nick=content.name), category="success")
     except IntegrityError as ex:
         ub.session.rollback()
         log.error("An unknown error occurred while changing user: {}".format(str(ex)))
-        flash(_(u"An unknown error occurred. Please try again later."), category="error")
+        flash(_("Oops! An unknown error occurred. Please try again later."), category="error")
     except OperationalError as e:
         ub.session.rollback()
         log.error_or_exception("Settings Database error: {}".format(e))
-        flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
+        flash(_("Oops! Database Error: %(error)s.", error=e.orig), category="error")
     return ""

cps/babel.py (13 changed lines)
@@ -1,7 +1,8 @@
 from babel import negotiate_locale
 from flask_babel import Babel, Locale
 from babel.core import UnknownLocaleError
-from flask import request, g
+from flask import request
+from flask_login import current_user

 from . import logger

@@ -9,14 +10,12 @@ log = logger.create()

 babel = Babel()


-@babel.localeselector
 def get_locale():
     # if a user is logged in, use the locale from the user settings
-    user = getattr(g, 'user', None)
-    if user is not None and hasattr(user, "locale"):
-        if user.name != 'Guest':  # if the account is the guest account bypass the config lang settings
-            return user.locale
+    if current_user is not None and hasattr(current_user, "locale"):
+        # if the account is the guest account bypass the config lang settings
+        if current_user.name != 'Guest':
+            return current_user.locale

     preferred = list()
     if request.accept_languages:
@@ -0,0 +1,53 @@
+# -*- coding: utf-8 -*-
+
+# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
+# Copyright (C) 2018-2019 OzzieIsaacs
+#
+# This program is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+
+from . import logger
+from lxml.etree import ParserError
+
+try:
+    # at least bleach 6.0 is needed -> incomplatible change from list arguments to set arguments
+    from bleach import clean_text as clean_html
+    BLEACH = True
+except ImportError:
+    try:
+        BLEACH = False
+        from nh3 import clean as clean_html
+    except ImportError:
+        try:
+            BLEACH = False
+            from lxml.html.clean import clean_html
+        except ImportError:
+            clean_html = None
+
+
+log = logger.create()
+
+
+def clean_string(unsafe_text, book_id=0):
+    try:
+        if BLEACH:
+            safe_text = clean_html(unsafe_text, tags=set(), attributes=set())
+        else:
+            safe_text = clean_html(unsafe_text)
+    except ParserError as e:
+        log.error("Comments of book {} are corrupted: {}".format(book_id, e))
+        safe_text = ""
+    except TypeError as e:
+        log.error("Comments can't be parsed, maybe 'lxml' is too new, try installing 'bleach': {}".format(e))
+        safe_text = ""
+    return safe_text
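The new module reduces book comments to plain text (under bleach, via empty `tags`/`attributes` sets). A stdlib-only sketch of that tag-stripping behaviour, using `html.parser` as a hypothetical stand-in for the bleach/nh3/lxml backends:

```python
from html.parser import HTMLParser


class _TagStripper(HTMLParser):
    """Collect only text content, dropping every tag and attribute."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)


def clean_string(unsafe_text):
    # Minimal stand-in for clean_html(unsafe_text, tags=set(), attributes=set())
    stripper = _TagStripper()
    stripper.feed(unsafe_text)
    return "".join(stripper.chunks)


print(clean_string('<p onclick="evil()">A <b>bold</b> comment</p>'))
# -> A bold comment
```

Unlike bleach or nh3 this does no real sanitization (no entity policy, no attribute filtering); it only illustrates the strip-everything end state.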
@@ -29,8 +29,8 @@ from .constants import DEFAULT_SETTINGS_FILE, DEFAULT_GDRIVE_FILE

 def version_info():
     if _NIGHTLY_VERSION[1].startswith('$Format'):
-        return "Calibre-Web version: %s - unknown git-clone" % _STABLE_VERSION['version']
-    return "Calibre-Web version: %s -%s" % (_STABLE_VERSION['version'], _NIGHTLY_VERSION[1])
+        return "Calibre-Web version: %s - unknown git-clone" % _STABLE_VERSION['version'].replace("b", " Beta")
+    return "Calibre-Web version: %s -%s" % (_STABLE_VERSION['version'].replace("b", " Beta"), _NIGHTLY_VERSION[1])


 class CliParameter(object):

@@ -48,9 +48,11 @@ class CliParameter(object):
                                 'works only in combination with keyfile')
         parser.add_argument('-k', metavar='path', help='path and name to SSL keyfile, e.g. /opt/test.key, '
                                 'works only in combination with certfile')
+        parser.add_argument('-o', metavar='path', help='path and name Calibre-Web logfile')
         parser.add_argument('-v', '--version', action='version', help='Shows version number and exits Calibre-Web',
                             version=version_info())
         parser.add_argument('-i', metavar='ip-address', help='Server IP-Address to listen')
+        parser.add_argument('-m', action='store_true', help='Use Memory-backend as limiter backend, use this parameter in case of miss configured backend')
         parser.add_argument('-s', metavar='user:pass',
                             help='Sets specific username to new password and exits Calibre-Web')
         parser.add_argument('-f', action='store_true', help='Flag is depreciated and will be removed in next version')

@@ -60,6 +62,7 @@ class CliParameter(object):
         parser.add_argument('-r', action='store_true', help='Enable public database reconnect route under /reconnect')
         args = parser.parse_args()

+        self.logpath = args.o or ""
         self.settings_path = args.p or os.path.join(_CONFIG_DIR, DEFAULT_SETTINGS_FILE)
         self.gd_path = args.g or os.path.join(_CONFIG_DIR, DEFAULT_GDRIVE_FILE)

@@ -96,6 +99,8 @@ class CliParameter(object):
         if args.k == "":
             self.keyfilepath = ""

+        # overwrite limiter backend
+        self.memory_backend = args.m or None
         # dry run updater
         self.dry_run = args.d or None
         # enable reconnect endpoint for docker database reconnect
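The two new CLI flags can be exercised in isolation; this is a reduced argparse sketch using only `-o` (logfile path) and `-m` (force the in-memory limiter backend) from the diff, with the rest of the parser omitted:

```python
import argparse

# Reduced sketch of the new flags; 'cps.py' as prog name is an assumption.
parser = argparse.ArgumentParser(prog="cps.py")
parser.add_argument('-o', metavar='path', help='path and name Calibre-Web logfile')
parser.add_argument('-m', action='store_true',
                    help='Use Memory-backend as limiter backend')
args = parser.parse_args(['-o', '/tmp/cw.log', '-m'])

logpath = args.o or ""            # mirrors: self.logpath = args.o or ""
memory_backend = args.m or None   # mirrors: self.memory_backend = args.m or None
print(logpath, memory_backend)
```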
cps/comic.py (62 changed lines)

@@ -36,6 +36,12 @@ try:
         from comicapi import __version__ as comic_version
     except ImportError:
         comic_version = ''
+    try:
+        from comicapi.comicarchive import load_archive_plugins
+        import comicapi.utils
+        comicapi.utils.add_rar_paths()
+    except ImportError:
+        load_archive_plugins = None
 except (ImportError, LookupError) as e:
     log.debug('Cannot import comicapi, extracting comic metadata will not work: %s', e)
     import zipfile

@@ -46,6 +52,12 @@ except (ImportError, LookupError) as e:
 except (ImportError, SyntaxError) as e:
     log.debug('Cannot import rarfile, extracting cover files from rar files will not work: %s', e)
     use_rarfile = False
+try:
+    import py7zr
+    use_7zip = True
+except (ImportError, SyntaxError) as e:
+    log.debug('Cannot import py7zr, extracting cover files from CB7 files will not work: %s', e)
+    use_7zip = False
 use_comic_meta = False


@@ -78,23 +90,40 @@ def _extract_cover_from_archive(original_file_extension, tmp_file_name, rar_exec
                 if len(ext) > 1:
                     extension = ext[1].lower()
                     if extension in cover.COVER_EXTENSIONS:
-                        cover_data = cf.read(name)
+                        cover_data = cf.read([name])
                         break
         except Exception as ex:
-            log.debug('Rarfile failed with error: {}'.format(ex))
+            log.error('Rarfile failed with error: {}'.format(ex))
+    elif original_file_extension.upper() == '.CB7' and use_7zip:
+        cf = py7zr.SevenZipFile(tmp_file_name)
+        for name in cf.getnames():
+            ext = os.path.splitext(name)
+            if len(ext) > 1:
+                extension = ext[1].lower()
+                if extension in cover.COVER_EXTENSIONS:
+                    try:
+                        cover_data = cf.read([name])[name].read()
+                    except (py7zr.Bad7zFile, OSError) as ex:
+                        log.error('7Zip file failed with error: {}'.format(ex))
+                    break
     return cover_data, extension


 def _extract_cover(tmp_file_name, original_file_extension, rar_executable):
     cover_data = extension = None
     if use_comic_meta:
-        archive = ComicArchive(tmp_file_name, rar_exe_path=rar_executable)
-        for index, name in enumerate(archive.getPageNameList()):
+        try:
+            archive = ComicArchive(tmp_file_name, rar_exe_path=rar_executable)
+        except TypeError:
+            archive = ComicArchive(tmp_file_name)
+        name_list = archive.getPageNameList if hasattr(archive, "getPageNameList") else archive.get_page_name_list
+        for index, name in enumerate(name_list()):
             ext = os.path.splitext(name)
             if len(ext) > 1:
                 extension = ext[1].lower()
                 if extension in cover.COVER_EXTENSIONS:
-                    cover_data = archive.getPage(index)
+                    get_page = archive.getPage if hasattr(archive, "getPageNameList") else archive.get_page
+                    cover_data = get_page(index)
                     break
     else:
         cover_data, extension = _extract_cover_from_archive(original_file_extension, tmp_file_name, rar_executable)

@@ -103,17 +132,26 @@ def _extract_cover(tmp_file_name, original_file_extension, rar_executable):

 def get_comic_info(tmp_file_path, original_file_name, original_file_extension, rar_executable):
     if use_comic_meta:
-        archive = ComicArchive(tmp_file_path, rar_exe_path=rar_executable)
-        if archive.seemsToBeAComicArchive():
-            if archive.hasMetadata(MetaDataStyle.CIX):
+        try:
+            archive = ComicArchive(tmp_file_path, rar_exe_path=rar_executable)
+        except TypeError:
+            load_archive_plugins(force=True, rar=rar_executable)
+            archive = ComicArchive(tmp_file_path)
+        if hasattr(archive, "seemsToBeAComicArchive"):
+            seems_archive = archive.seemsToBeAComicArchive
+        else:
+            seems_archive = archive.seems_to_be_a_comic_archive
+        if seems_archive():
+            has_metadata = archive.hasMetadata if hasattr(archive, "hasMetadata") else archive.has_metadata
+            if has_metadata(MetaDataStyle.CIX):
                 style = MetaDataStyle.CIX
-            elif archive.hasMetadata(MetaDataStyle.CBI):
+            elif has_metadata(MetaDataStyle.CBI):
                 style = MetaDataStyle.CBI
             else:
                 style = None

-            # if style is not None:
-            loaded_metadata = archive.readMetadata(style)
+            read_metadata = archive.readMetadata if hasattr(archive, "readMetadata") else archive.read_metadata
+            loaded_metadata = read_metadata(style)

             lang = loaded_metadata.language or ""
             loaded_metadata.language = isoLanguages.get_lang3(lang)

@@ -138,7 +176,7 @@ def get_comic_info(tmp_file_path, original_file_name, original_file_extension, r
         file_path=tmp_file_path,
         extension=original_file_extension,
         title=original_file_name,
-        author=u'Unknown',
+        author='Unknown',
         cover=_extract_cover(tmp_file_path, original_file_extension, rar_executable),
         description="",
         tags="",
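Several hunks above bridge the comicapi rename from camelCase to snake_case with `hasattr` probes. A self-contained sketch of that compatibility shim, with two hypothetical classes standing in for the old and new `ComicArchive`:

```python
class OldComicArchive:
    """Hypothetical stand-in for comicapi's older camelCase API."""
    def getPageNameList(self):
        return ["000.jpg", "001.jpg"]


class NewComicArchive:
    """Hypothetical stand-in for the newer snake_case API."""
    def get_page_name_list(self):
        return ["000.jpg", "001.jpg"]


def page_names(archive):
    # Same hasattr probe the diff uses to support both comicapi generations.
    name_list = (archive.getPageNameList if hasattr(archive, "getPageNameList")
                 else archive.get_page_name_list)
    return name_list()


assert page_names(OldComicArchive()) == page_names(NewComicArchive())
```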
@@ -23,6 +23,10 @@ import json
 from sqlalchemy import Column, String, Integer, SmallInteger, Boolean, BLOB, JSON
 from sqlalchemy.exc import OperationalError
 from sqlalchemy.sql.expression import text
+from sqlalchemy import exists
+from cryptography.fernet import Fernet
+import cryptography.exceptions
+from base64 import urlsafe_b64decode
 try:
     # Compatibility with sqlalchemy 2.0
     from sqlalchemy.orm import declarative_base

@@ -30,6 +34,7 @@ except ImportError:
     from sqlalchemy.ext.declarative import declarative_base

 from . import constants, logger
+from .subproc_wrapper import process_wait


 log = logger.create()
@@ -56,7 +61,8 @@ class _Settings(_Base):
     mail_port = Column(Integer, default=25)
     mail_use_ssl = Column(SmallInteger, default=0)
     mail_login = Column(String, default='mail@example.com')
-    mail_password = Column(String, default='mypassword')
+    mail_password_e = Column(String)
+    mail_password = Column(String)
     mail_from = Column(String, default='automailer <mail@example.com>')
     mail_size = Column(Integer, default=25*1024*1024)
     mail_server_type = Column(SmallInteger, default=0)

@@ -64,24 +70,25 @@

     config_calibre_dir = Column(String)
     config_calibre_uuid = Column(String)
+    config_calibre_split = Column(Boolean, default=False)
+    config_calibre_split_dir = Column(String)
     config_port = Column(Integer, default=constants.DEFAULT_PORT)
     config_external_port = Column(Integer, default=constants.DEFAULT_PORT)
     config_certfile = Column(String)
     config_keyfile = Column(String)
     config_trustedhosts = Column(String, default='')
-    config_calibre_web_title = Column(String, default=u'Calibre-Web')
+    config_calibre_web_title = Column(String, default='Calibre-Web')
     config_books_per_page = Column(Integer, default=60)
     config_random_books = Column(Integer, default=4)
     config_authors_max = Column(Integer, default=0)
     config_read_column = Column(Integer, default=0)
-    config_title_regex = Column(String, default=r'^(A|The|An|Der|Die|Das|Den|Ein|Eine|Einen|Dem|Des|Einem|Eines)\s+')
-    # config_mature_content_tags = Column(String, default='')
+    config_title_regex = Column(String, default=r'^(A|The|An|Der|Die|Das|Den|Ein|Eine|Einen|Dem|Des|Einem|Eines|Le|La|Les|L\'|Un|Une)\s+')
     config_theme = Column(Integer, default=0)

     config_log_level = Column(SmallInteger, default=logger.DEFAULT_LOG_LEVEL)
-    config_logfile = Column(String)
+    config_logfile = Column(String, default=logger.DEFAULT_LOG_FILE)
     config_access_log = Column(SmallInteger, default=0)
-    config_access_logfile = Column(String)
+    config_access_logfile = Column(String, default=logger.DEFAULT_ACCESS_LOG)

     config_uploading = Column(SmallInteger, default=0)
     config_anonbrowse = Column(SmallInteger, default=0)

@@ -107,7 +114,6 @@ class _Settings(_Base):

     config_use_goodreads = Column(Boolean, default=False)
     config_goodreads_api_key = Column(String)
-    config_goodreads_api_secret = Column(String)
     config_register_email = Column(Boolean, default=False)
     config_login_type = Column(Integer, default=0)

@@ -117,7 +123,8 @@ class _Settings(_Base):
     config_ldap_port = Column(SmallInteger, default=389)
     config_ldap_authentication = Column(SmallInteger, default=constants.LDAP_AUTH_SIMPLE)
     config_ldap_serv_username = Column(String, default='cn=admin,dc=example,dc=org')
-    config_ldap_serv_password = Column(String, default="")
+    config_ldap_serv_password_e = Column(String)
+    config_ldap_serv_password = Column(String)
     config_ldap_encryption = Column(SmallInteger, default=0)
     config_ldap_cacert_path = Column(String, default="")
     config_ldap_cert_path = Column(String, default="")
@@ -132,10 +139,12 @@ class _Settings(_Base):

     config_kepubifypath = Column(String, default=None)
     config_converterpath = Column(String, default=None)
+    config_binariesdir = Column(String, default=None)
     config_calibre = Column(String)
     config_rarfile_location = Column(String, default=None)
     config_upload_formats = Column(String, default=','.join(constants.EXTENSIONS_UPLOAD))
     config_unicode_filename = Column(Boolean, default=False)
+    config_embed_metadata = Column(Boolean, default=True)

     config_updatechannel = Column(Integer, default=constants.UPDATE_STABLE)
@@ -147,29 +156,45 @@ class _Settings(_Base):
     schedule_generate_book_covers = Column(Boolean, default=False)
     schedule_generate_series_covers = Column(Boolean, default=False)
     schedule_reconnect = Column(Boolean, default=False)
+    schedule_metadata_backup = Column(Boolean, default=False)
+
+    config_password_policy = Column(Boolean, default=True)
+    config_password_min_length = Column(Integer, default=8)
+    config_password_number = Column(Boolean, default=True)
+    config_password_lower = Column(Boolean, default=True)
+    config_password_upper = Column(Boolean, default=True)
+    config_password_character = Column(Boolean, default=True)
+    config_password_special = Column(Boolean, default=True)
+    config_session = Column(Integer, default=1)
+    config_ratelimiter = Column(Boolean, default=True)
+    config_limiter_uri = Column(String, default="")
+    config_limiter_options = Column(String, default="")

     def __repr__(self):
         return self.__class__.__name__


 # Class holds all application specific settings in calibre-web
-class _ConfigSQL(object):
+class ConfigSQL(object):
     # pylint: disable=no-member
     def __init__(self):
-        pass
+        self.__dict__["dirty"] = list()

-    def init_config(self, session, cli):
+    def init_config(self, session, secret_key, cli):
         self._session = session
         self._settings = None
         self.db_configured = None
         self.config_calibre_dir = None
-        self.load()
+        self._fernet = Fernet(secret_key)
         self.cli = cli
+        self.load()

         change = False
-        if self.config_converterpath == None:  # pylint: disable=access-member-before-definition
+
+        if self.config_binariesdir == None:  # pylint: disable=access-member-before-definition
             change = True
-            self.config_converterpath = autodetect_calibre_binary()
+            self.config_binariesdir = autodetect_calibre_binaries()
+            self.config_converterpath = autodetect_converter_binary(self.config_binariesdir)

         if self.config_kepubifypath == None:  # pylint: disable=access-member-before-definition
             change = True
@@ -293,10 +318,10 @@ class _ConfigSQL(object):
             setattr(self, field, new_value)
         return True

-    def toDict(self):
+    def to_dict(self):
         storage = {}
         for k, v in self.__dict__.items():
-            if k[0] != '_' and not k.endswith("password") and not k.endswith("secret") and not k == "cli":
+            if k[0] != '_' and not k.endswith("_e") and not k == "cli":
                 storage[k] = v
         return storage
@@ -310,7 +335,13 @@ class _ConfigSQL(object):
             column = s.__class__.__dict__.get(k)
             if column.default is not None:
                 v = column.default.arg
-            setattr(self, k, v)
+            if k.endswith("_e") and v is not None:
+                try:
+                    setattr(self, k, self._fernet.decrypt(v).decode())
+                except cryptography.fernet.InvalidToken:
+                    setattr(self, k, "")
+            else:
+                setattr(self, k, v)

         have_metadata_db = bool(self.config_calibre_dir)
         if have_metadata_db:
@@ -318,30 +349,37 @@ class _ConfigSQL(object):
             have_metadata_db = os.path.isfile(db_file)
         self.db_configured = have_metadata_db
         constants.EXTENSIONS_UPLOAD = [x.lstrip().rstrip().lower() for x in self.config_upload_formats.split(',')]
+        from . import cli_param
         if os.environ.get('FLASK_DEBUG'):
             logfile = logger.setup(logger.LOG_TO_STDOUT, logger.logging.DEBUG)
         else:
             # pylint: disable=access-member-before-definition
-            logfile = logger.setup(self.config_logfile, self.config_log_level)
-        if logfile != self.config_logfile:
-            log.warning("Log path %s not valid, falling back to default", self.config_logfile)
+            logfile = logger.setup(cli_param.logpath or self.config_logfile, self.config_log_level)
+        if logfile != os.path.abspath(self.config_logfile):
+            if logfile != os.path.abspath(cli_param.logpath):
+                log.warning("Log path %s not valid, falling back to default", self.config_logfile)
             self.config_logfile = logfile
+            s.config_logfile = logfile
         self._session.merge(s)
         try:
             self._session.commit()
         except OperationalError as e:
             log.error('Database error: %s', e)
             self._session.rollback()
+        self.__dict__["dirty"] = list()

     def save(self):
         """Apply all configuration values to the underlying storage."""
         s = self._read_from_storage()  # type: _Settings

-        for k, v in self.__dict__.items():
+        for k in self.dirty:
             if k[0] == '_':
                 continue
             if hasattr(s, k):
-                setattr(s, k, v)
+                if k.endswith("_e"):
+                    setattr(s, k, self._fernet.encrypt(self.__dict__[k].encode()))
+                else:
+                    setattr(s, k, self.__dict__[k])

         log.debug("_ConfigSQL updating storage")
         self._session.merge(s)
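`save()` now iterates only `self.dirty`, the list that the new `__setattr__` (added further down in the diff) appends every assigned attribute name to. A minimal sketch of that dirty-tracking pattern, with hypothetical attribute names:

```python
class DirtyTracker:
    """Minimal sketch of the dirty-list pattern: every attribute assignment is
    recorded, so save() only has to write back what actually changed."""
    def __init__(self):
        # write through __dict__ so "dirty" itself is not tracked
        self.__dict__["dirty"] = []

    def __setattr__(self, name, value):
        super().__setattr__(name, value)
        self.__dict__["dirty"].append(name)


cfg = DirtyTracker()
cfg.mail_port = 25
cfg.mail_login = "mail@example.com"
assert cfg.dirty == ["mail_port", "mail_login"]
```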
@@ -357,9 +395,11 @@ class _ConfigSQL(object):
         log.error(error)
         log.warning("invalidating configuration")
         self.db_configured = False
-        # self.config_calibre_dir = None
         self.save()

+    def get_book_path(self):
+        return self.config_calibre_split_dir if self.config_calibre_split_dir else self.config_calibre_dir
+
     def store_calibre_uuid(self, calibre_db, Library_table):
         try:
             calibre_uuid = calibre_db.session.query(Library_table).one_or_none()
@@ -369,8 +409,34 @@ class _ConfigSQL(object):
         except AttributeError:
             pass

+    def __setattr__(self, attr_name, attr_value):
+        super().__setattr__(attr_name, attr_value)
+        self.__dict__["dirty"].append(attr_name)
+

-def _migrate_table(session, orm_class):
+def _encrypt_fields(session, secret_key):
+    try:
+        session.query(exists().where(_Settings.mail_password_e)).scalar()
+    except OperationalError:
+        with session.bind.connect() as conn:
+            conn.execute(text("ALTER TABLE settings ADD column 'mail_password_e' String"))
+            conn.execute(text("ALTER TABLE settings ADD column 'config_ldap_serv_password_e' String"))
+        session.commit()
+        crypter = Fernet(secret_key)
+        settings = session.query(_Settings.mail_password, _Settings.config_ldap_serv_password).first()
+        if settings.mail_password:
+            session.query(_Settings).update(
+                {_Settings.mail_password_e: crypter.encrypt(settings.mail_password.encode())})
+        if settings.config_ldap_serv_password:
+            session.query(_Settings).update(
+                {_Settings.config_ldap_serv_password_e:
+                     crypter.encrypt(settings.config_ldap_serv_password.encode())})
+        session.commit()
+
+
+def _migrate_table(session, orm_class, secret_key=None):
+    if secret_key:
+        _encrypt_fields(session, secret_key)
     changed = False

     for column_name, column in orm_class.__dict__.items():
@@ -408,17 +474,35 @@ def _migrate_table(session, orm_class, secret_key=None):
             session.rollback()


-def autodetect_calibre_binary():
+def autodetect_calibre_binaries():
     if sys.platform == "win32":
-        calibre_path = ["C:\\program files\\calibre\\ebook-convert.exe",
-                        "C:\\program files(x86)\\calibre\\ebook-convert.exe",
-                        "C:\\program files(x86)\\calibre2\\ebook-convert.exe",
-                        "C:\\program files\\calibre2\\ebook-convert.exe"]
+        calibre_path = ["C:\\program files\\calibre\\",
+                        "C:\\program files(x86)\\calibre\\",
+                        "C:\\program files(x86)\\calibre2\\",
+                        "C:\\program files\\calibre2\\"]
     else:
-        calibre_path = ["/opt/calibre/ebook-convert"]
+        calibre_path = ["/opt/calibre/"]
     for element in calibre_path:
-        if os.path.isfile(element) and os.access(element, os.X_OK):
-            return element
+        supported_binary_paths = [os.path.join(element, binary)
+                                  for binary in constants.SUPPORTED_CALIBRE_BINARIES.values()]
+        if all(os.path.isfile(binary_path) and os.access(binary_path, os.X_OK)
+               for binary_path in supported_binary_paths):
+            values = [process_wait([binary_path, "--version"],
+                                   pattern=r'\(calibre (.*)\)') for binary_path in supported_binary_paths]
+            if all(values):
+                version = values[0].group(1)
+                log.debug("calibre version %s", version)
+                return element
     return ""
+
+
+def autodetect_converter_binary(calibre_path):
+    if sys.platform == "win32":
+        converter_path = os.path.join(calibre_path, "ebook-convert.exe")
+    else:
+        converter_path = os.path.join(calibre_path, "ebook-convert")
+    if calibre_path and os.path.isfile(converter_path) and os.access(converter_path, os.X_OK):
+        return converter_path
+    return ""
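`autodetect_calibre_binaries()` accepts a directory only if every expected binary in `constants.SUPPORTED_CALIBRE_BINARIES` is present and executable. The contents of that mapping aren't shown in the diff, so the sketch below uses assumed names (`ebook-convert`, `calibredb`) and a throwaway directory:

```python
import os
import tempfile


def all_binaries_present(directory, names):
    # Mirrors the diff's all(...) check: every expected binary must exist
    # in the directory and be executable before the path is accepted.
    paths = [os.path.join(directory, name) for name in names]
    return all(os.path.isfile(p) and os.access(p, os.X_OK) for p in paths)


with tempfile.TemporaryDirectory() as d:
    for name in ("ebook-convert", "calibredb"):  # assumed binary names
        path = os.path.join(d, name)
        with open(path, "w") as f:
            f.write("#!/bin/sh\n")
        os.chmod(path, 0o755)
    assert all_binaries_present(d, ("ebook-convert", "calibredb"))
    assert not all_binaries_present(d, ("ebook-convert", "no-such-binary"))
```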
@@ -446,22 +530,18 @@ def autodetect_kepubify_binary():
     return ""


-def _migrate_database(session):
+def _migrate_database(session, secret_key):
     # make sure the table is created, if it does not exist
     _Base.metadata.create_all(session.bind)
-    _migrate_table(session, _Settings)
+    _migrate_table(session, _Settings, secret_key)
     _migrate_table(session, _Flask_Settings)


-def load_configuration(conf, session, cli):
-    _migrate_database(session)
+def load_configuration(session, secret_key):
+    _migrate_database(session, secret_key)

     if not session.query(_Settings).count():
         session.add(_Settings())
         session.commit()
-    # conf = _ConfigSQL()
-    conf.init_config(session, cli)
-    # return conf


 def get_flask_session_key(_session):
@ -471,3 +551,25 @@ def get_flask_session_key(_session):
|
||||||
_session.add(flask_settings)
|
_session.add(flask_settings)
|
||||||
_session.commit()
|
_session.commit()
|
||||||
return flask_settings.flask_session_key
|
return flask_settings.flask_session_key
|
||||||
|
|
||||||
|
|
||||||
|
def get_encryption_key(key_path):
|
||||||
|
key_file = os.path.join(key_path, ".key")
|
||||||
|
generate = True
|
||||||
|
error = ""
|
||||||
|
if os.path.exists(key_file) and os.path.getsize(key_file) > 32:
|
||||||
|
with open(key_file, "rb") as f:
|
||||||
|
key = f.read()
|
||||||
|
try:
|
||||||
|
urlsafe_b64decode(key)
|
||||||
|
generate = False
|
||||||
|
except ValueError:
|
||||||
|
pass
|
||||||
|
if generate:
|
||||||
|
key = Fernet.generate_key()
|
||||||
|
try:
|
||||||
|
with open(key_file, "wb") as f:
|
||||||
|
f.write(key)
|
||||||
|
except PermissionError as e:
|
||||||
|
error = e
|
||||||
|
return key, error
|
||||||
|
|
|
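The new `get_encryption_key` helper persists a Fernet key in a `.key` file and only regenerates it when the stored bytes fail url-safe base64 validation. A minimal standalone sketch of that flow, assuming `os.urandom` plus `urlsafe_b64encode` as a stand-in for `cryptography`'s `Fernet.generate_key()` (which produces the same url-safe base64 encoding of 32 random bytes):

```python
import os
import tempfile
from base64 import urlsafe_b64encode, urlsafe_b64decode

def load_or_create_key(key_path):
    """Sketch of the diff's get_encryption_key(): reuse .key if it decodes, else regenerate."""
    key_file = os.path.join(key_path, ".key")
    key, error = None, ""
    if os.path.exists(key_file) and os.path.getsize(key_file) > 32:
        with open(key_file, "rb") as f:
            key = f.read()
        try:
            urlsafe_b64decode(key)  # same validity check as the diff
        except ValueError:
            key = None
    if key is None:
        # stand-in for Fernet.generate_key(): base64 of 32 random bytes
        key = urlsafe_b64encode(os.urandom(32))
        try:
            with open(key_file, "wb") as f:
                f.write(key)
        except PermissionError as e:
            error = e
    return key, error

with tempfile.TemporaryDirectory() as d:
    first, err = load_or_create_key(d)   # creates the file
    second, _ = load_or_create_key(d)    # reuses it
    assert first == second and err == ""
```

Returning the `PermissionError` instead of raising lets the caller surface a readable configuration warning while still running with an in-memory key.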
@@ -34,6 +34,8 @@ UPDATER_AVAILABLE = True

 # Base dir is parent of current file, necessary if called from different folder
 BASE_DIR = os.path.abspath(os.path.join(os.path.dirname(os.path.abspath(__file__)), os.pardir))
+# if executable file the files should be placed in the parent dir (parallel to the exe file)

 STATIC_DIR = os.path.join(BASE_DIR, 'cps', 'static')
 TEMPLATES_DIR = os.path.join(BASE_DIR, 'cps', 'templates')
 TRANSLATIONS_DIR = os.path.join(BASE_DIR, 'cps', 'translations')

@@ -49,6 +51,9 @@ if HOME_CONFIG:
     CONFIG_DIR = os.environ.get('CALIBRE_DBPATH', home_dir)
 else:
     CONFIG_DIR = os.environ.get('CALIBRE_DBPATH', BASE_DIR)
+if getattr(sys, 'frozen', False):
+    CONFIG_DIR = os.path.abspath(os.path.join(CONFIG_DIR, os.pardir))

 DEFAULT_SETTINGS_FILE = "app.db"
 DEFAULT_GDRIVE_FILE = "gdrive.db"

@@ -144,13 +149,18 @@ del env_CALIBRE_PORT

 EXTENSIONS_AUDIO = {'mp3', 'mp4', 'ogg', 'opus', 'wav', 'flac', 'm4a', 'm4b'}
 EXTENSIONS_CONVERT_FROM = ['pdf', 'epub', 'mobi', 'azw3', 'docx', 'rtf', 'fb2', 'lit', 'lrf',
-                           'txt', 'htmlz', 'rtf', 'odt', 'cbz', 'cbr']
+                           'txt', 'htmlz', 'rtf', 'odt', 'cbz', 'cbr', 'prc']
 EXTENSIONS_CONVERT_TO = ['pdf', 'epub', 'mobi', 'azw3', 'docx', 'rtf', 'fb2',
                          'lit', 'lrf', 'txt', 'htmlz', 'rtf', 'odt']
-EXTENSIONS_UPLOAD = {'txt', 'pdf', 'epub', 'kepub', 'mobi', 'azw', 'azw3', 'cbr', 'cbz', 'cbt', 'djvu',
+EXTENSIONS_UPLOAD = {'txt', 'pdf', 'epub', 'kepub', 'mobi', 'azw', 'azw3', 'cbr', 'cbz', 'cbt', 'cb7', 'djvu', 'djv',
                      'prc', 'doc', 'docx', 'fb2', 'html', 'rtf', 'lit', 'odt', 'mp3', 'mp4', 'ogg',
                      'opus', 'wav', 'flac', 'm4a', 'm4b'}

+_extension = ""
+if sys.platform == "win32":
+    _extension = ".exe"
+SUPPORTED_CALIBRE_BINARIES = {binary:binary + _extension for binary in ["ebook-convert", "calibredb"]}


 def has_flag(value, bit_flag):
     return bit_flag == (bit_flag & (value or 0))

@@ -163,13 +173,12 @@ def selected_roles(dictionary):

 BookMeta = namedtuple('BookMeta', 'file_path, extension, title, author, cover, description, tags, series, '
                                   'series_id, languages, publisher, pubdate, identifiers')

-STABLE_VERSION = {'version': '0.6.19'}
+# python build process likes to have x.y.zbw -> b for beta and w a counting number
+STABLE_VERSION = {'version': '0.6.22b'}

 NIGHTLY_VERSION = dict()
 NIGHTLY_VERSION[0] = '$Format:%H$'
 NIGHTLY_VERSION[1] = '$Format:%cI$'
-# NIGHTLY_VERSION[0] = 'bb7d2c6273ae4560e83950d36d64533343623a57'
-# NIGHTLY_VERSION[1] = '2018-09-09T10:13:08+02:00'

 # CACHE
 CACHE_TYPE_THUMBNAILS = 'thumbnails'
cps/db.py (167 changed lines)

@@ -111,66 +111,77 @@ class Identifiers(Base):
     def format_type(self):
         format_type = self.type.lower()
         if format_type == 'amazon':
-            return u"Amazon"
+            return "Amazon"
         elif format_type.startswith("amazon_"):
-            return u"Amazon.{0}".format(format_type[7:])
+            return "Amazon.{0}".format(format_type[7:])
         elif format_type == "isbn":
-            return u"ISBN"
+            return "ISBN"
         elif format_type == "doi":
-            return u"DOI"
+            return "DOI"
         elif format_type == "douban":
-            return u"Douban"
+            return "Douban"
         elif format_type == "goodreads":
-            return u"Goodreads"
+            return "Goodreads"
         elif format_type == "babelio":
-            return u"Babelio"
+            return "Babelio"
         elif format_type == "google":
-            return u"Google Books"
+            return "Google Books"
         elif format_type == "kobo":
-            return u"Kobo"
+            return "Kobo"
+        elif format_type == "barnesnoble":
+            return "Barnes & Noble"
         elif format_type == "litres":
-            return u"ЛитРес"
+            return "ЛитРес"
         elif format_type == "issn":
-            return u"ISSN"
+            return "ISSN"
         elif format_type == "isfdb":
-            return u"ISFDB"
+            return "ISFDB"
         if format_type == "lubimyczytac":
-            return u"Lubimyczytac"
+            return "Lubimyczytac"
+        if format_type == "databazeknih":
+            return "Databáze knih"
         else:
             return self.type

     def __repr__(self):
         format_type = self.type.lower()
         if format_type == "amazon" or format_type == "asin":
-            return u"https://amazon.com/dp/{0}".format(self.val)
+            return "https://amazon.com/dp/{0}".format(self.val)
         elif format_type.startswith('amazon_'):
-            return u"https://amazon.{0}/dp/{1}".format(format_type[7:], self.val)
+            return "https://amazon.{0}/dp/{1}".format(format_type[7:], self.val)
         elif format_type == "isbn":
-            return u"https://www.worldcat.org/isbn/{0}".format(self.val)
+            return "https://www.worldcat.org/isbn/{0}".format(self.val)
         elif format_type == "doi":
-            return u"https://dx.doi.org/{0}".format(self.val)
+            return "https://dx.doi.org/{0}".format(self.val)
         elif format_type == "goodreads":
-            return u"https://www.goodreads.com/book/show/{0}".format(self.val)
+            return "https://www.goodreads.com/book/show/{0}".format(self.val)
         elif format_type == "babelio":
-            return u"https://www.babelio.com/livres/titre/{0}".format(self.val)
+            return "https://www.babelio.com/livres/titre/{0}".format(self.val)
         elif format_type == "douban":
-            return u"https://book.douban.com/subject/{0}".format(self.val)
+            return "https://book.douban.com/subject/{0}".format(self.val)
         elif format_type == "google":
-            return u"https://books.google.com/books?id={0}".format(self.val)
+            return "https://books.google.com/books?id={0}".format(self.val)
         elif format_type == "kobo":
-            return u"https://www.kobo.com/ebook/{0}".format(self.val)
+            return "https://www.kobo.com/ebook/{0}".format(self.val)
+        elif format_type == "barnesnoble":
+            return "https://www.barnesandnoble.com/w/{0}".format(self.val)
         elif format_type == "lubimyczytac":
-            return u"https://lubimyczytac.pl/ksiazka/{0}/ksiazka".format(self.val)
+            return "https://lubimyczytac.pl/ksiazka/{0}/ksiazka".format(self.val)
         elif format_type == "litres":
-            return u"https://www.litres.ru/{0}".format(self.val)
+            return "https://www.litres.ru/{0}".format(self.val)
         elif format_type == "issn":
-            return u"https://portal.issn.org/resource/ISSN/{0}".format(self.val)
+            return "https://portal.issn.org/resource/ISSN/{0}".format(self.val)
         elif format_type == "isfdb":
-            return u"http://www.isfdb.org/cgi-bin/pl.cgi?{0}".format(self.val)
+            return "http://www.isfdb.org/cgi-bin/pl.cgi?{0}".format(self.val)
+        elif format_type == "databazeknih":
+            return "https://www.databazeknih.cz/knihy/{0}".format(self.val)
         elif self.val.lower().startswith("javascript:"):
             return quote(self.val)
+        elif self.val.lower().startswith("data:"):
+            link , __, __ = str.partition(self.val, ",")
+            return link
         else:
-            return u"{0}".format(self.val)
+            return "{0}".format(self.val)
 class Comments(Base):

@@ -188,7 +199,7 @@ class Comments(Base):
         return self.text

     def __repr__(self):
-        return u"<Comments({0})>".format(self.text)
+        return "<Comments({0})>".format(self.text)


 class Tags(Base):

@@ -203,8 +214,11 @@ class Tags(Base):
     def get(self):
         return self.name

+    def __eq__(self, other):
+        return self.name == other
+
     def __repr__(self):
-        return u"<Tags('{0})>".format(self.name)
+        return "<Tags('{0})>".format(self.name)


 class Authors(Base):

@@ -215,7 +229,7 @@ class Authors(Base):
     sort = Column(String(collation='NOCASE'))
     link = Column(String, nullable=False, default="")

-    def __init__(self, name, sort, link):
+    def __init__(self, name, sort, link=""):
         self.name = name
         self.sort = sort
         self.link = link

@@ -223,8 +237,11 @@ class Authors(Base):
     def get(self):
         return self.name

+    def __eq__(self, other):
+        return self.name == other
+
     def __repr__(self):
-        return u"<Authors('{0},{1}{2}')>".format(self.name, self.sort, self.link)
+        return "<Authors('{0},{1}{2}')>".format(self.name, self.sort, self.link)


 class Series(Base):

@@ -241,8 +258,11 @@ class Series(Base):
     def get(self):
         return self.name

+    def __eq__(self, other):
+        return self.name == other
+
     def __repr__(self):
-        return u"<Series('{0},{1}')>".format(self.name, self.sort)
+        return "<Series('{0},{1}')>".format(self.name, self.sort)


 class Ratings(Base):

@@ -257,8 +277,11 @@ class Ratings(Base):
     def get(self):
         return self.rating

+    def __eq__(self, other):
+        return self.rating == other
+
     def __repr__(self):
-        return u"<Ratings('{0}')>".format(self.rating)
+        return "<Ratings('{0}')>".format(self.rating)


 class Languages(Base):

@@ -271,13 +294,16 @@ class Languages(Base):
         self.lang_code = lang_code

     def get(self):
-        if self.language_name:
+        if hasattr(self, "language_name"):
             return self.language_name
         else:
             return self.lang_code

+    def __eq__(self, other):
+        return self.lang_code == other
+
     def __repr__(self):
-        return u"<Languages('{0}')>".format(self.lang_code)
+        return "<Languages('{0}')>".format(self.lang_code)


 class Publishers(Base):

@@ -294,8 +320,11 @@ class Publishers(Base):
     def get(self):
         return self.name

+    def __eq__(self, other):
+        return self.name == other
+
     def __repr__(self):
-        return u"<Publishers('{0},{1}')>".format(self.name, self.sort)
+        return "<Publishers('{0},{1}')>".format(self.name, self.sort)
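The diff adds the same `__eq__` pattern to Tags, Authors, Series, Ratings, Languages and Publishers: comparing the row against its plain value lets callers match and membership-test ORM objects directly against strings. A SQLAlchemy-free sketch of the idea (the `Tag` class here is a stripped-down stand-in, not the real model):

```python
class Tag:
    """Stand-in for the ORM classes gaining __eq__ in this diff."""
    def __init__(self, name):
        self.name = name

    def __eq__(self, other):
        # compare against the bare value, as the diff does
        return self.name == other

    def __repr__(self):
        return "<Tags('{0})>".format(self.name)

tags = [Tag("Fantasy"), Tag("Science Fiction")]
assert tags[0] == "Fantasy"
assert "Science Fiction" in tags  # list membership uses __eq__
```

One caveat worth keeping in mind: in Python 3, defining `__eq__` without `__hash__` makes instances unhashable, which matters if these rows are ever used as set members or dict keys.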
 class Data(Base):

@@ -319,7 +348,16 @@ class Data(Base):
         return self.name

     def __repr__(self):
-        return u"<Data('{0},{1}{2}{3}')>".format(self.book, self.format, self.uncompressed_size, self.name)
+        return "<Data('{0},{1}{2}{3}')>".format(self.book, self.format, self.uncompressed_size, self.name)


+class Metadata_Dirtied(Base):
+    __tablename__ = 'metadata_dirtied'
+    id = Column(Integer, primary_key=True, autoincrement=True)
+    book = Column(Integer, ForeignKey('books.id'), nullable=False, unique=True)
+
+    def __init__(self, book):
+        self.book = book
+
+
 class Books(Base):

@@ -364,7 +402,7 @@ class Books(Base):
         self.has_cover = (has_cover != None)

     def __repr__(self):
-        return u"<Books('{0},{1}{2}{3}{4}{5}{6}{7}{8}')>".format(self.title, self.sort, self.author_sort,
+        return "<Books('{0},{1}{2}{3}{4}{5}{6}{7}{8}')>".format(self.title, self.sort, self.author_sort,
                                                                  self.timestamp, self.pubdate, self.series_index,
                                                                  self.last_modified, self.path, self.has_cover)

@@ -390,6 +428,33 @@ class CustomColumns(Base):
         display_dict = json.loads(self.display)
         return display_dict

+    def to_json(self, value, extra, sequence):
+        content = dict()
+        content['table'] = "custom_column_" + str(self.id)
+        content['column'] = "value"
+        content['datatype'] = self.datatype
+        content['is_multiple'] = None if not self.is_multiple else "|"
+        content['kind'] = "field"
+        content['name'] = self.name
+        content['search_terms'] = ['#' + self.label]
+        content['label'] = self.label
+        content['colnum'] = self.id
+        content['display'] = self.get_display_dict()
+        content['is_custom'] = True
+        content['is_category'] = self.datatype in ['text', 'rating', 'enumeration', 'series']
+        content['link_column'] = "value"
+        content['category_sort'] = "value"
+        content['is_csp'] = False
+        content['is_editable'] = self.editable
+        content['rec_index'] = sequence + 22  # toDo why ??
+        if isinstance(value, datetime):
+            content['#value#'] = {"__class__": "datetime.datetime", "__value__": value.strftime("%Y-%m-%dT%H:%M:%S+00:00")}
+        else:
+            content['#value#'] = value
+        content['#extra#'] = extra
+        content['is_multiple2'] = {} if not self.is_multiple else {"cache_to_list": "|", "ui_to_list": ",", "list_to_ui": ", "}
+        return json.dumps(content, ensure_ascii=False)
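The interesting detail in the new `CustomColumns.to_json` is the `#value#` field: datetimes are not serialized directly but wrapped in a tagged dict so the consumer can reconstruct the type. A small sketch of just that branch (`encode_custom_value` is a hypothetical helper, not part of the diff):

```python
import json
from datetime import datetime

def encode_custom_value(value):
    """Hypothetical helper mirroring the #value# handling in to_json():
    datetimes become a tagged dict, everything else passes through."""
    if isinstance(value, datetime):
        return {"__class__": "datetime.datetime",
                "__value__": value.strftime("%Y-%m-%dT%H:%M:%S+00:00")}
    return value

payload = {"#value#": encode_custom_value(datetime(2023, 5, 1, 12, 0, 0))}
assert json.dumps(payload, ensure_ascii=False) == (
    '{"#value#": {"__class__": "datetime.datetime", '
    '"__value__": "2023-05-01T12:00:00+00:00"}}'
)
```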
 class AlchemyEncoder(json.JSONEncoder):

@@ -602,7 +667,7 @@ class CalibreDB:

         cls.session_factory = scoped_session(sessionmaker(autocommit=False,
                                                           autoflush=True,
-                                                          bind=cls.engine))
+                                                          bind=cls.engine, future=True))
         for inst in cls.instances:
             inst.init_session()

@@ -641,6 +706,18 @@ class CalibreDB:
     def get_book_format(self, book_id, file_format):
         return self.session.query(Data).filter(Data.book == book_id).filter(Data.format == file_format).first()

+    def set_metadata_dirty(self, book_id):
+        if not self.session.query(Metadata_Dirtied).filter(Metadata_Dirtied.book == book_id).one_or_none():
+            self.session.add(Metadata_Dirtied(book_id))
+
+    def delete_dirty_metadata(self, book_id):
+        try:
+            self.session.query(Metadata_Dirtied).filter(Metadata_Dirtied.book == book_id).delete()
+            self.session.commit()
+        except (OperationalError) as e:
+            self.session.rollback()
+            log.error("Database error: {}".format(e))
+
     # Language and content filters for displaying in the UI
     def common_filters(self, allow_show_archived=False, return_all_languages=False):
         if not allow_show_archived:

@@ -766,8 +843,7 @@ class CalibreDB:
         entries = list()
         pagination = list()
         try:
-            pagination = Pagination(page, pagesize,
-                                    len(query.all()))
+            pagination = Pagination(page, pagesize, query.count())
             entries = query.order_by(*order).offset(off).limit(pagesize).all()
         except Exception as ex:
             log.error_or_exception(ex)
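The pagination hunk swaps `len(query.all())` for `query.count()`: instead of materializing every row in Python just to count them, the counting happens inside the database. A plain-`sqlite3` illustration of the two approaches (same answer, very different amount of data pulled out of the database; the table and row count here are made up for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE books (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO books (title) VALUES (?)",
                 [("Book %d" % i,) for i in range(500)])

# what len(query.all()) does: fetch all 500 rows into Python, then count
rows = conn.execute("SELECT * FROM books").fetchall()
slow_count = len(rows)

# what query.count() does: let SQLite count and return a single value
fast_count = conn.execute("SELECT COUNT(*) FROM books").fetchone()[0]

assert slow_count == fast_count == 500
```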
@@ -777,8 +853,6 @@ class CalibreDB:

     # Orders all Authors in the list according to authors sort
     def order_authors(self, entries, list_return=False, combined=False):
-        # entries_copy = copy.deepcopy(entries)
-        # entries_copy =[]
         for entry in entries:
             if combined:
                 sort_authors = entry.Books.author_sort.split('&')

@@ -943,7 +1017,12 @@ class CalibreDB:
             title = title[len(prep):] + ', ' + prep
         return title.strip()

-        conn = conn or self.session.connection().connection.connection
+        try:
+            # sqlalchemy <1.4.24
+            conn = conn or self.session.connection().connection.driver_connection
+        except AttributeError:
+            # sqlalchemy >1.4.24 and sqlalchemy 2.0
+            conn = conn or self.session.connection().connection.connection
         try:
             conn.create_function("title_sort", 1, _title_sort)
         except sqliteOperationalError:
@@ -65,7 +65,7 @@ def send_debug():
     file_list.remove(element)
     memory_zip = BytesIO()
     with zipfile.ZipFile(memory_zip, 'w', compression=zipfile.ZIP_DEFLATED) as zf:
-        zf.writestr('settings.txt', json.dumps(config.toDict(), sort_keys=True, indent=2))
+        zf.writestr('settings.txt', json.dumps(config.to_dict(), sort_keys=True, indent=2))
         zf.writestr('libs.txt', json.dumps(collect_stats(), sort_keys=True, indent=2, cls=lazyEncoder))
         for fp in file_list:
             zf.write(fp, os.path.basename(fp))
@@ -61,7 +61,7 @@ def dependency_check(optional=False):
     deps = load_dependencies(optional)
     for dep in deps:
         try:
-            dep_version_int = [int(x) for x in dep[0].split('.')]
+            dep_version_int = [int(x) if x.isnumeric() else 0 for x in dep[0].split('.')]
             low_check = [int(x) for x in dep[3].split('.')]
             high_check = [int(x) for x in dep[5].split('.')]
         except AttributeError:
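The dependency-check fix guards against installed versions whose segments are not pure integers (pre-releases like `2.0.0rc1`), which previously made `int()` raise; non-numeric segments now collapse to 0. The comprehension in isolation (wrapped in a hypothetical `parse_version` helper for the demo):

```python
def parse_version(version):
    # non-numeric segments ("0rc1", "0b1", "22b") map to 0 instead of raising
    return [int(x) if x.isnumeric() else 0 for x in version.split('.')]

assert parse_version("1.4.24") == [1, 4, 24]
assert parse_version("2.0.0rc1") == [2, 0, 0]  # "0rc1" is not numeric -> 0
assert parse_version("6.0b1") == [6, 0]        # "0b1" -> 0
```

Mapping to 0 keeps the later list comparison against the lower and upper bounds working, at the cost of treating a pre-release as the base release for range checks.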
@@ -25,29 +25,43 @@ from datetime import datetime
 import json
 from shutil import copyfile
 from uuid import uuid4
-from markupsafe import escape # dependency of flask
+from markupsafe import escape, Markup # dependency of flask
 from functools import wraps
+# from lxml.etree import ParserError

-try:
-    from lxml.html.clean import clean_html
-except ImportError:
-    clean_html = None
+#try:
+#    # at least bleach 6.0 is needed -> incomplatible change from list arguments to set arguments
+#    from bleach import clean_text as clean_html
+#    BLEACH = True
+#except ImportError:
+#    try:
+#        BLEACH = False
+#        from nh3 import clean as clean_html
+#    except ImportError:
+#        try:
+#            BLEACH = False
+#            from lxml.html.clean import clean_html
+#        except ImportError:
+#            clean_html = None

-from flask import Blueprint, request, flash, redirect, url_for, abort, Markup, Response
+from flask import Blueprint, request, flash, redirect, url_for, abort, Response
 from flask_babel import gettext as _
 from flask_babel import lazy_gettext as N_
 from flask_babel import get_locale
 from flask_login import current_user, login_required
-from sqlalchemy.exc import OperationalError, IntegrityError
+from sqlalchemy.exc import OperationalError, IntegrityError, InterfaceError
 from sqlalchemy.orm.exc import StaleDataError
+from sqlalchemy.sql.expression import func

 from . import constants, logger, isoLanguages, gdriveutils, uploader, helper, kobo_sync_status
+from .clean_html import clean_string
 from . import config, ub, db, calibre_db
 from .services.worker import WorkerThread
 from .tasks.upload import TaskUpload
 from .render_template import render_title_template
 from .usermanagement import login_required_if_no_ano
 from .kobo_sync_status import change_archived_books
+from .redirect import get_redirect_location


 editbook = Blueprint('edit-book', __name__)

@@ -84,7 +98,7 @@ def delete_book_from_details(book_id):
 @editbook.route("/delete/<int:book_id>/<string:book_format>", methods=["POST"])
 @login_required
 def delete_book_ajax(book_id, book_format):
-    return delete_book_from_table(book_id, book_format, False)
+    return delete_book_from_table(book_id, book_format, False, request.form.to_dict().get('location', ""))


 @editbook.route("/admin/book/<int:book_id>", methods=['GET'])

@@ -107,7 +121,7 @@ def edit_book(book_id):
     book = calibre_db.get_filtered_book(book_id, allow_show_archived=True)
     # Book not found
     if not book:
-        flash(_(u"Oops! Selected book title is unavailable. File does not exist or is not accessible"),
+        flash(_("Oops! Selected book is unavailable. File does not exist or is not accessible"),
               category="error")
         return redirect(url_for("web.index"))

@@ -125,7 +139,7 @@ def edit_book(book_id):
         edited_books_id = book.id
         modify_date = True
         title_author_error = helper.update_dir_structure(edited_books_id,
-                                                         config.config_calibre_dir,
+                                                         config.get_book_path(),
                                                          input_authors[0],
                                                          renamed_author=renamed)
         if title_author_error:

@@ -151,7 +165,7 @@ def edit_book(book_id):
         if to_save.get("cover_url", None):
             if not current_user.role_upload():
                 edit_error = True
-                flash(_(u"User has no rights to upload cover"), category="error")
+                flash(_("User has no rights to upload cover"), category="error")
             if to_save["cover_url"].endswith('/static/generic_cover.jpg'):
                 book.has_cover = 0
             else:

@@ -203,6 +217,7 @@ def edit_book(book_id):
         if modify_date:
             book.last_modified = datetime.utcnow()
             kobo_sync_status.remove_synced_book(edited_books_id, all=True)
+            calibre_db.set_metadata_dirty(book.id)

         calibre_db.session.merge(book)
         calibre_db.session.commit()

@@ -222,10 +237,10 @@ def edit_book(book_id):
         calibre_db.session.rollback()
         flash(str(e), category="error")
         return redirect(url_for('web.show_book', book_id=book.id))
-    except (OperationalError, IntegrityError, StaleDataError) as e:
+    except (OperationalError, IntegrityError, StaleDataError, InterfaceError) as e:
         log.error_or_exception("Database error: {}".format(e))
         calibre_db.session.rollback()
-        flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
+        flash(_("Oops! Database Error: %(error)s.", error=e.orig if hasattr(e, "orig") else e), category="error")
         return redirect(url_for('web.show_book', book_id=book.id))
     except Exception as ex:
         log.error_or_exception(ex)

@@ -269,7 +284,7 @@ def upload():
                                                 meta.extension.lower())
         else:
             error = helper.update_dir_structure(book_id,
-                                                config.config_calibre_dir,
+                                                config.get_book_path(),
                                                 input_authors[0],
                                                 meta.file_path,
                                                 title_dir + meta.extension.lower(),

@@ -277,6 +292,8 @@ def upload():

         move_coverfile(meta, db_book)
+        if modify_date:
+            calibre_db.set_metadata_dirty(book_id)
         # save data to database, reread data
         calibre_db.session.commit()

@@ -285,7 +302,7 @@ def upload():
         if error:
             flash(error, category="error")
         link = '<a href="{}">{}</a>'.format(url_for('web.show_book', book_id=book_id), escape(title))
-        upload_text = N_(u"File %(file)s uploaded", file=link)
+        upload_text = N_("File %(file)s uploaded", file=link)
         WorkerThread.add(current_user.name, TaskUpload(upload_text, escape(title)))
         helper.add_book_to_thumbnail_cache(book_id)

@@ -299,7 +316,8 @@ def upload():
     except (OperationalError, IntegrityError, StaleDataError) as e:
         calibre_db.session.rollback()
         log.error_or_exception("Database error: {}".format(e))
-        flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
+        flash(_("Oops! Database Error: %(error)s.", error=e.orig if hasattr(e, "orig") else e),
+              category="error")
         return Response(json.dumps({"location": url_for("web.index")}), mimetype='application/json')


@@ -312,19 +330,19 @@ def convert_bookformat(book_id):
     book_format_to = request.form.get('book_format_to', None)

     if (book_format_from is None) or (book_format_to is None):
-        flash(_(u"Source or destination format for conversion missing"), category="error")
+        flash(_("Source or destination format for conversion missing"), category="error")
         return redirect(url_for('edit-book.show_edit_book', book_id=book_id))

     log.info('converting: book id: %s from: %s to: %s', book_id, book_format_from, book_format_to)
-    rtn = helper.convert_book_format(book_id, config.config_calibre_dir, book_format_from.upper(),
+    rtn = helper.convert_book_format(book_id, config.get_book_path(), book_format_from.upper(),
|
||||||
book_format_to.upper(), current_user.name)
|
book_format_to.upper(), current_user.name)
|
||||||
|
|
||||||
if rtn is None:
|
if rtn is None:
|
||||||
flash(_(u"Book successfully queued for converting to %(book_format)s",
|
flash(_("Book successfully queued for converting to %(book_format)s",
|
||||||
book_format=book_format_to),
|
book_format=book_format_to),
|
||||||
category="success")
|
category="success")
|
||||||
else:
|
else:
|
||||||
flash(_(u"There was an error converting this book: %(res)s", res=rtn), category="error")
|
flash(_("There was an error converting this book: %(res)s", res=rtn), category="error")
|
||||||
return redirect(url_for('edit-book.show_edit_book', book_id=book_id))
|
return redirect(url_for('edit-book.show_edit_book', book_id=book_id))
|
||||||
|
|
||||||
|
|
||||||
|
@ -386,7 +404,7 @@ def edit_list_book(param):
|
||||||
elif param == 'title':
|
elif param == 'title':
|
||||||
sort_param = book.sort
|
sort_param = book.sort
|
||||||
if handle_title_on_edit(book, vals.get('value', "")):
|
if handle_title_on_edit(book, vals.get('value', "")):
|
||||||
rename_error = helper.update_dir_structure(book.id, config.config_calibre_dir)
|
rename_error = helper.update_dir_structure(book.id, config.get_book_path())
|
||||||
if not rename_error:
|
if not rename_error:
|
||||||
ret = Response(json.dumps({'success': True, 'newValue': book.title}),
|
ret = Response(json.dumps({'success': True, 'newValue': book.title}),
|
||||||
mimetype='application/json')
|
mimetype='application/json')
|
||||||
|
@ -404,7 +422,7 @@ def edit_list_book(param):
|
||||||
mimetype='application/json')
|
mimetype='application/json')
|
||||||
elif param == 'authors':
|
elif param == 'authors':
|
||||||
input_authors, __, renamed = handle_author_on_edit(book, vals['value'], vals.get('checkA', None) == "true")
|
input_authors, __, renamed = handle_author_on_edit(book, vals['value'], vals.get('checkA', None) == "true")
|
||||||
rename_error = helper.update_dir_structure(book.id, config.config_calibre_dir, input_authors[0],
|
rename_error = helper.update_dir_structure(book.id, config.get_book_path(), input_authors[0],
|
||||||
renamed_author=renamed)
|
renamed_author=renamed)
|
||||||
if not rename_error:
|
if not rename_error:
|
||||||
ret = Response(json.dumps({
|
ret = Response(json.dumps({
|
||||||
|
@ -448,7 +466,7 @@ def edit_list_book(param):
|
||||||
calibre_db.session.rollback()
|
calibre_db.session.rollback()
|
||||||
log.error_or_exception("Database error: {}".format(e))
|
log.error_or_exception("Database error: {}".format(e))
|
||||||
ret = Response(json.dumps({'success': False,
|
ret = Response(json.dumps({'success': False,
|
||||||
'msg': 'Database error: {}'.format(e.orig)}),
|
'msg': 'Database error: {}'.format(e.orig if hasattr(e, "orig") else e)}),
|
||||||
mimetype='application/json')
|
mimetype='application/json')
|
||||||
return ret
|
return ret
|
||||||
|
|
||||||
|
@ -466,7 +484,7 @@ def get_sorted_entry(field, bookid):
|
||||||
if field == 'sort':
|
if field == 'sort':
|
||||||
return json.dumps({'sort': book.title})
|
return json.dumps({'sort': book.title})
|
||||||
if field == 'author_sort':
|
if field == 'author_sort':
|
||||||
return json.dumps({'author_sort': book.author})
|
return json.dumps({'authors': " & ".join([a.name for a in calibre_db.order_authors([book])])})
|
||||||
return ""
|
return ""
|
||||||
|
|
||||||
|
|
||||||
|
@ -508,10 +526,10 @@ def merge_list_book():
|
||||||
for element in from_book.data:
|
for element in from_book.data:
|
||||||
if element.format not in to_file:
|
if element.format not in to_file:
|
||||||
# create new data entry with: book_id, book_format, uncompressed_size, name
|
# create new data entry with: book_id, book_format, uncompressed_size, name
|
||||||
filepath_new = os.path.normpath(os.path.join(config.config_calibre_dir,
|
filepath_new = os.path.normpath(os.path.join(config.get_book_path(),
|
||||||
to_book.path,
|
to_book.path,
|
||||||
to_name + "." + element.format.lower()))
|
to_name + "." + element.format.lower()))
|
||||||
filepath_old = os.path.normpath(os.path.join(config.config_calibre_dir,
|
filepath_old = os.path.normpath(os.path.join(config.get_book_path(),
|
||||||
from_book.path,
|
from_book.path,
|
||||||
element.name + "." + element.format.lower()))
|
element.name + "." + element.format.lower()))
|
||||||
copyfile(filepath_old, filepath_new)
|
copyfile(filepath_old, filepath_new)
|
||||||
|
@ -551,15 +569,16 @@ def table_xchange_author_title():
|
||||||
|
|
||||||
if edited_books_id:
|
if edited_books_id:
|
||||||
# toDo: Handle error
|
# toDo: Handle error
|
||||||
edit_error = helper.update_dir_structure(edited_books_id, config.config_calibre_dir, input_authors[0],
|
edit_error = helper.update_dir_structure(edited_books_id, config.get_book_path(), input_authors[0],
|
||||||
renamed_author=renamed)
|
renamed_author=renamed)
|
||||||
if modify_date:
|
if modify_date:
|
||||||
book.last_modified = datetime.utcnow()
|
book.last_modified = datetime.utcnow()
|
||||||
|
calibre_db.set_metadata_dirty(book.id)
|
||||||
try:
|
try:
|
||||||
calibre_db.session.commit()
|
calibre_db.session.commit()
|
||||||
except (OperationalError, IntegrityError, StaleDataError) as e:
|
except (OperationalError, IntegrityError, StaleDataError) as e:
|
||||||
calibre_db.session.rollback()
|
calibre_db.session.rollback()
|
||||||
log.error_or_exception("Database error: %s", e)
|
log.error_or_exception("Database error: {}".format(e))
|
||||||
return json.dumps({'success': False})
|
return json.dumps({'success': False})
|
||||||
|
|
||||||
if config.config_use_google_drive:
|
if config.config_use_google_drive:
|
||||||
|
@ -569,9 +588,9 @@ def table_xchange_author_title():
|
||||||
|
|
||||||
|
|
||||||
def merge_metadata(to_save, meta):
|
def merge_metadata(to_save, meta):
|
||||||
if to_save.get('author_name', "") == _(u'Unknown'):
|
if to_save.get('author_name', "") == _('Unknown'):
|
||||||
to_save['author_name'] = ''
|
to_save['author_name'] = ''
|
||||||
if to_save.get('book_title', "") == _(u'Unknown'):
|
if to_save.get('book_title', "") == _('Unknown'):
|
||||||
to_save['book_title'] = ''
|
to_save['book_title'] = ''
|
||||||
for s_field, m_field in [
|
for s_field, m_field in [
|
||||||
('tags', 'tags'), ('author_name', 'author'), ('series', 'series'),
|
('tags', 'tags'), ('author_name', 'author'), ('series', 'series'),
|
||||||
|
@ -593,6 +612,8 @@ def identifier_list(to_save, book):
|
||||||
val_key = id_val_prefix + type_key[len(id_type_prefix):]
|
val_key = id_val_prefix + type_key[len(id_type_prefix):]
|
||||||
if val_key not in to_save.keys():
|
if val_key not in to_save.keys():
|
||||||
continue
|
continue
|
||||||
|
if to_save[val_key].startswith("data:"):
|
||||||
|
to_save[val_key], __, __ = str.partition(to_save[val_key], ",")
|
||||||
result.append(db.Identifiers(to_save[val_key], type_value, book.id))
|
result.append(db.Identifiers(to_save[val_key], type_value, book.id))
|
||||||
return result
|
return result
|
||||||
|
|
||||||
|
@ -607,7 +628,7 @@ def prepare_authors(authr):
|
||||||
|
|
||||||
# we have all author names now
|
# we have all author names now
|
||||||
if input_authors == ['']:
|
if input_authors == ['']:
|
||||||
input_authors = [_(u'Unknown')] # prevent empty Author
|
input_authors = [_('Unknown')] # prevent empty Author
|
||||||
|
|
||||||
renamed = list()
|
renamed = list()
|
||||||
for in_aut in input_authors:
|
for in_aut in input_authors:
|
||||||
|
@ -624,11 +645,11 @@ def prepare_authors(authr):
|
||||||
|
|
||||||
|
|
||||||
def prepare_authors_on_upload(title, authr):
|
def prepare_authors_on_upload(title, authr):
|
||||||
if title != _(u'Unknown') and authr != _(u'Unknown'):
|
if title != _('Unknown') and authr != _('Unknown'):
|
||||||
entry = calibre_db.check_exists_book(authr, title)
|
entry = calibre_db.check_exists_book(authr, title)
|
||||||
if entry:
|
if entry:
|
||||||
log.info("Uploaded book probably exists in library")
|
log.info("Uploaded book probably exists in library")
|
||||||
flash(_(u"Uploaded book probably exists in the library, consider to change before upload new: ")
|
flash(_("Uploaded book probably exists in the library, consider to change before upload new: ")
|
||||||
+ Markup(render_title_template('book_exists_flash.html', entry=entry)), category="warning")
|
+ Markup(render_title_template('book_exists_flash.html', entry=entry)), category="warning")
|
||||||
|
|
||||||
input_authors, renamed = prepare_authors(authr)
|
input_authors, renamed = prepare_authors(authr)
|
||||||
|
@ -683,7 +704,7 @@ def create_book_on_upload(modify_date, meta):
|
||||||
modify_date |= edit_book_languages(meta.languages, db_book, upload_mode=True, invalid=invalid)
|
modify_date |= edit_book_languages(meta.languages, db_book, upload_mode=True, invalid=invalid)
|
||||||
if invalid:
|
if invalid:
|
||||||
for lang in invalid:
|
for lang in invalid:
|
||||||
flash(_(u"'%(langname)s' is not a valid language", langname=lang), category="warning")
|
flash(_("'%(langname)s' is not a valid language", langname=lang), category="warning")
|
||||||
|
|
||||||
# handle tags
|
# handle tags
|
||||||
modify_date |= edit_book_tags(meta.tags, db_book)
|
modify_date |= edit_book_tags(meta.tags, db_book)
|
||||||
|
@ -733,7 +754,7 @@ def file_handling_on_upload(requested_file):
|
||||||
meta = uploader.upload(requested_file, config.config_rarfile_location)
|
meta = uploader.upload(requested_file, config.config_rarfile_location)
|
||||||
except (IOError, OSError):
|
except (IOError, OSError):
|
||||||
log.error("File %s could not saved to temp dir", requested_file.filename)
|
log.error("File %s could not saved to temp dir", requested_file.filename)
|
||||||
flash(_(u"File %(filename)s could not saved to temp dir",
|
flash(_("File %(filename)s could not saved to temp dir",
|
||||||
filename=requested_file.filename), category="error")
|
filename=requested_file.filename), category="error")
|
||||||
return None, Response(json.dumps({"location": url_for("web.index")}), mimetype='application/json')
|
return None, Response(json.dumps({"location": url_for("web.index")}), mimetype='application/json')
|
||||||
return meta, None
|
return meta, None
|
||||||
|
@ -745,7 +766,7 @@ def move_coverfile(meta, db_book):
|
||||||
cover_file = meta.cover
|
cover_file = meta.cover
|
||||||
else:
|
else:
|
||||||
cover_file = os.path.join(constants.STATIC_DIR, 'generic_cover.jpg')
|
cover_file = os.path.join(constants.STATIC_DIR, 'generic_cover.jpg')
|
||||||
new_cover_path = os.path.join(config.config_calibre_dir, db_book.path)
|
new_cover_path = os.path.join(config.get_book_path(), db_book.path)
|
||||||
try:
|
try:
|
||||||
os.makedirs(new_cover_path, exist_ok=True)
|
os.makedirs(new_cover_path, exist_ok=True)
|
||||||
copyfile(cover_file, os.path.join(new_cover_path, "cover.jpg"))
|
copyfile(cover_file, os.path.join(new_cover_path, "cover.jpg"))
|
||||||
|
@ -753,7 +774,7 @@ def move_coverfile(meta, db_book):
|
||||||
os.unlink(meta.cover)
|
os.unlink(meta.cover)
|
||||||
except OSError as e:
|
except OSError as e:
|
||||||
log.error("Failed to move cover file %s: %s", new_cover_path, e)
|
log.error("Failed to move cover file %s: %s", new_cover_path, e)
|
||||||
flash(_(u"Failed to Move Cover File %(file)s: %(error)s", file=new_cover_path,
|
flash(_("Failed to Move Cover File %(file)s: %(error)s", file=new_cover_path,
|
||||||
error=e),
|
error=e),
|
||||||
category="error")
|
category="error")
|
||||||
|
|
||||||
|
@ -767,7 +788,7 @@ def delete_whole_book(book_id, book):
|
||||||
|
|
||||||
# check if only this book links to:
|
# check if only this book links to:
|
||||||
# author, language, series, tags, custom columns
|
# author, language, series, tags, custom columns
|
||||||
modify_database_object([u''], book.authors, db.Authors, calibre_db.session, 'author')
|
modify_database_object([''], book.authors, db.Authors, calibre_db.session, 'author')
|
||||||
modify_database_object([u''], book.tags, db.Tags, calibre_db.session, 'tags')
|
modify_database_object([u''], book.tags, db.Tags, calibre_db.session, 'tags')
|
||||||
modify_database_object([u''], book.series, db.Series, calibre_db.session, 'series')
|
modify_database_object([u''], book.series, db.Series, calibre_db.session, 'series')
|
||||||
modify_database_object([u''], book.languages, db.Languages, calibre_db.session, 'languages')
|
modify_database_object([u''], book.languages, db.Languages, calibre_db.session, 'languages')
|
||||||
|
@ -804,7 +825,7 @@ def delete_whole_book(book_id, book):
|
||||||
calibre_db.session.query(db.Books).filter(db.Books.id == book_id).delete()
|
calibre_db.session.query(db.Books).filter(db.Books.id == book_id).delete()
|
||||||
|
|
||||||
|
|
||||||
def render_delete_book_result(book_format, json_response, warning, book_id):
|
def render_delete_book_result(book_format, json_response, warning, book_id, location=""):
|
||||||
if book_format:
|
if book_format:
|
||||||
if json_response:
|
if json_response:
|
||||||
return json.dumps([warning, {"location": url_for("edit-book.show_edit_book", book_id=book_id),
|
return json.dumps([warning, {"location": url_for("edit-book.show_edit_book", book_id=book_id),
|
||||||
|
@ -816,22 +837,22 @@ def render_delete_book_result(book_format, json_response, warning, book_id):
|
||||||
return redirect(url_for('edit-book.show_edit_book', book_id=book_id))
|
return redirect(url_for('edit-book.show_edit_book', book_id=book_id))
|
||||||
else:
|
else:
|
||||||
if json_response:
|
if json_response:
|
||||||
return json.dumps([warning, {"location": url_for('web.index'),
|
return json.dumps([warning, {"location": get_redirect_location(location, "web.index"),
|
||||||
"type": "success",
|
"type": "success",
|
||||||
"format": book_format,
|
"format": book_format,
|
||||||
"message": _('Book Successfully Deleted')}])
|
"message": _('Book Successfully Deleted')}])
|
||||||
else:
|
else:
|
||||||
flash(_('Book Successfully Deleted'), category="success")
|
flash(_('Book Successfully Deleted'), category="success")
|
||||||
return redirect(url_for('web.index'))
|
return redirect(get_redirect_location(location, "web.index"))
|
||||||
|
|
||||||
|
|
||||||
def delete_book_from_table(book_id, book_format, json_response):
|
def delete_book_from_table(book_id, book_format, json_response, location=""):
|
||||||
warning = {}
|
warning = {}
|
||||||
if current_user.role_delete_books():
|
if current_user.role_delete_books():
|
||||||
book = calibre_db.get_book(book_id)
|
book = calibre_db.get_book(book_id)
|
||||||
if book:
|
if book:
|
||||||
try:
|
try:
|
||||||
result, error = helper.delete_book(book, config.config_calibre_dir, book_format=book_format.upper())
|
result, error = helper.delete_book(book, config.get_book_path(), book_format=book_format.upper())
|
||||||
if not result:
|
if not result:
|
||||||
if json_response:
|
if json_response:
|
||||||
return json.dumps([{"location": url_for("edit-book.show_edit_book", book_id=book_id),
|
return json.dumps([{"location": url_for("edit-book.show_edit_book", book_id=book_id),
|
||||||
|
@ -872,7 +893,7 @@ def delete_book_from_table(book_id, book_format, json_response):
|
||||||
else:
|
else:
|
||||||
# book not found
|
# book not found
|
||||||
log.error('Book with id "%s" could not be deleted: not found', book_id)
|
log.error('Book with id "%s" could not be deleted: not found', book_id)
|
||||||
return render_delete_book_result(book_format, json_response, warning, book_id)
|
return render_delete_book_result(book_format, json_response, warning, book_id, location)
|
||||||
message = _("You are missing permissions to delete books")
|
message = _("You are missing permissions to delete books")
|
||||||
if json_response:
|
if json_response:
|
||||||
return json.dumps({"location": url_for("edit-book.show_edit_book", book_id=book_id),
|
return json.dumps({"location": url_for("edit-book.show_edit_book", book_id=book_id),
|
||||||
|
@ -888,7 +909,7 @@ def render_edit_book(book_id):
|
||||||
cc = calibre_db.session.query(db.CustomColumns).filter(db.CustomColumns.datatype.notin_(db.cc_exceptions)).all()
|
cc = calibre_db.session.query(db.CustomColumns).filter(db.CustomColumns.datatype.notin_(db.cc_exceptions)).all()
|
||||||
book = calibre_db.get_filtered_book(book_id, allow_show_archived=True)
|
book = calibre_db.get_filtered_book(book_id, allow_show_archived=True)
|
||||||
if not book:
|
if not book:
|
||||||
flash(_(u"Oops! Selected book title is unavailable. File does not exist or is not accessible"),
|
flash(_("Oops! Selected book is unavailable. File does not exist or is not accessible"),
|
||||||
category="error")
|
category="error")
|
||||||
return redirect(url_for("web.index"))
|
return redirect(url_for("web.index"))
|
||||||
|
|
||||||
|
@ -923,7 +944,7 @@ def render_edit_book(book_id):
|
||||||
if kepub_possible:
|
if kepub_possible:
|
||||||
allowed_conversion_formats.append('kepub')
|
allowed_conversion_formats.append('kepub')
|
||||||
return render_title_template('book_edit.html', book=book, authors=author_names, cc=cc,
|
return render_title_template('book_edit.html', book=book, authors=author_names, cc=cc,
|
||||||
title=_(u"edit metadata"), page="editbook",
|
title=_("edit metadata"), page="editbook",
|
||||||
conversion_formats=allowed_conversion_formats,
|
conversion_formats=allowed_conversion_formats,
|
||||||
config=config,
|
config=config,
|
||||||
source_formats=valid_source_formats)
|
source_formats=valid_source_formats)
|
||||||
|
@ -984,7 +1005,18 @@ def edit_book_series_index(series_index, book):
|
||||||
def edit_book_comments(comments, book):
|
def edit_book_comments(comments, book):
|
||||||
modify_date = False
|
modify_date = False
|
||||||
if comments:
|
if comments:
|
||||||
comments = clean_html(comments)
|
comments = clean_string(comments, book.id)
|
||||||
|
#try:
|
||||||
|
# if BLEACH:
|
||||||
|
# comments = clean_html(comments, tags=set(), attributes=set())
|
||||||
|
# else:
|
||||||
|
# comments = clean_html(comments)
|
||||||
|
#except ParserError as e:
|
||||||
|
# log.error("Comments of book {} are corrupted: {}".format(book.id, e))
|
||||||
|
# comments = ""
|
||||||
|
#except TypeError as e:
|
||||||
|
# log.error("Comments can't be parsed, maybe 'lxml' is too new, try installing 'bleach': {}".format(e))
|
||||||
|
# comments = ""
|
||||||
if len(book.comments):
|
if len(book.comments):
|
||||||
if book.comments[0].text != comments:
|
if book.comments[0].text != comments:
|
||||||
book.comments[0].text = comments
|
book.comments[0].text = comments
|
||||||
|
@ -1008,7 +1040,7 @@ def edit_book_languages(languages, book, upload_mode=False, invalid=None):
|
||||||
if isinstance(invalid, list):
|
if isinstance(invalid, list):
|
||||||
invalid.append(lang)
|
invalid.append(lang)
|
||||||
else:
|
else:
|
||||||
raise ValueError(_(u"'%(langname)s' is not a valid language", langname=lang))
|
raise ValueError(_("'%(langname)s' is not a valid language", langname=lang))
|
||||||
# ToDo: Not working correct
|
# ToDo: Not working correct
|
||||||
if upload_mode and len(input_l) == 1:
|
if upload_mode and len(input_l) == 1:
|
||||||
# If the language of the file is excluded from the users view, it's not imported, to allow the user to view
|
# If the language of the file is excluded from the users view, it's not imported, to allow the user to view
|
||||||
|
@ -1042,7 +1074,19 @@ def edit_cc_data_value(book_id, book, c, to_save, cc_db_value, cc_string):
|
||||||
elif c.datatype == 'comments':
|
elif c.datatype == 'comments':
|
||||||
to_save[cc_string] = Markup(to_save[cc_string]).unescape()
|
to_save[cc_string] = Markup(to_save[cc_string]).unescape()
|
||||||
if to_save[cc_string]:
|
if to_save[cc_string]:
|
||||||
to_save[cc_string] = clean_html(to_save[cc_string])
|
to_save[cc_string] = clean_string(to_save[cc_string], book_id)
|
||||||
|
#try:
|
||||||
|
# if BLEACH:
|
||||||
|
# to_save[cc_string] = clean_html(to_save[cc_string], tags=set(), attributes=set())
|
||||||
|
# else:
|
||||||
|
# to_save[cc_string] = clean_html(to_save[cc_string])
|
||||||
|
#except ParserError as e:
|
||||||
|
# log.error("Customs Comments of book {} are corrupted: {}".format(book_id, e))
|
||||||
|
# to_save[cc_string] = ""
|
||||||
|
#except TypeError as e:
|
||||||
|
# to_save[cc_string] = ""
|
||||||
|
# log.error("Customs Comments can't be parsed, maybe 'lxml' is too new, "
|
||||||
|
# "try installing 'bleach': {}".format(e))
|
||||||
elif c.datatype == 'datetime':
|
elif c.datatype == 'datetime':
|
||||||
try:
|
try:
|
||||||
to_save[cc_string] = datetime.strptime(to_save[cc_string], "%Y-%m-%d")
|
to_save[cc_string] = datetime.strptime(to_save[cc_string], "%Y-%m-%d")
|
||||||
|
@ -1119,9 +1163,10 @@ def edit_cc_data(book_id, book, to_save, cc):
|
||||||
cc_db_value = None
|
cc_db_value = None
|
||||||
if to_save[cc_string].strip():
|
if to_save[cc_string].strip():
|
||||||
if c.datatype in ['int', 'bool', 'float', "datetime", "comments"]:
|
if c.datatype in ['int', 'bool', 'float', "datetime", "comments"]:
|
||||||
changed, to_save = edit_cc_data_value(book_id, book, c, to_save, cc_db_value, cc_string)
|
change, to_save = edit_cc_data_value(book_id, book, c, to_save, cc_db_value, cc_string)
|
||||||
else:
|
else:
|
||||||
changed, to_save = edit_cc_data_string(book, c, to_save, cc_db_value, cc_string)
|
change, to_save = edit_cc_data_string(book, c, to_save, cc_db_value, cc_string)
|
||||||
|
changed |= change
|
||||||
else:
|
else:
|
||||||
if cc_db_value is not None:
|
if cc_db_value is not None:
|
||||||
# remove old cc_val
|
# remove old cc_val
|
||||||
|
@ -1150,7 +1195,7 @@ def upload_single_file(file_request, book, book_id):
|
||||||
# check for empty request
|
# check for empty request
|
||||||
if requested_file.filename != '':
|
if requested_file.filename != '':
|
||||||
if not current_user.role_upload():
|
if not current_user.role_upload():
|
||||||
flash(_(u"User has no rights to upload additional file formats"), category="error")
|
flash(_("User has no rights to upload additional file formats"), category="error")
|
||||||
return False
|
return False
|
||||||
if '.' in requested_file.filename:
|
if '.' in requested_file.filename:
|
||||||
file_ext = requested_file.filename.rsplit('.', 1)[-1].lower()
|
file_ext = requested_file.filename.rsplit('.', 1)[-1].lower()
|
||||||
|
@ -1163,7 +1208,7 @@ def upload_single_file(file_request, book, book_id):
|
||||||
return False
|
return False
|
||||||
|
|
||||||
file_name = book.path.rsplit('/', 1)[-1]
|
file_name = book.path.rsplit('/', 1)[-1]
|
||||||
filepath = os.path.normpath(os.path.join(config.config_calibre_dir, book.path))
|
filepath = os.path.normpath(os.path.join(config.get_book_path(), book.path))
|
||||||
saved_filename = os.path.join(filepath, file_name + '.' + file_ext)
|
saved_filename = os.path.join(filepath, file_name + '.' + file_ext)
|
||||||
|
|
||||||
# check if file path exists, otherwise create it, copy file to calibre path and delete temp file
|
# check if file path exists, otherwise create it, copy file to calibre path and delete temp file
|
||||||
|
@ -1171,12 +1216,12 @@ def upload_single_file(file_request, book, book_id):
|
||||||
try:
|
try:
|
||||||
os.makedirs(filepath)
|
os.makedirs(filepath)
|
||||||
except OSError:
|
except OSError:
|
||||||
flash(_(u"Failed to create path %(path)s (Permission denied).", path=filepath), category="error")
|
flash(_("Failed to create path %(path)s (Permission denied).", path=filepath), category="error")
|
||||||
return False
|
return False
|
||||||
try:
|
try:
|
||||||
requested_file.save(saved_filename)
|
requested_file.save(saved_filename)
|
||||||
except OSError:
|
except OSError:
|
||||||
flash(_(u"Failed to store file %(file)s.", file=saved_filename), category="error")
|
flash(_("Failed to store file %(file)s.", file=saved_filename), category="error")
|
||||||
return False
|
return False
|
||||||
|
|
||||||
file_size = os.path.getsize(saved_filename)
|
file_size = os.path.getsize(saved_filename)
|
||||||
|
@ -1194,17 +1239,18 @@ def upload_single_file(file_request, book, book_id):
|
||||||
except (OperationalError, IntegrityError, StaleDataError) as e:
|
except (OperationalError, IntegrityError, StaleDataError) as e:
|
||||||
calibre_db.session.rollback()
|
calibre_db.session.rollback()
|
||||||
log.error_or_exception("Database error: {}".format(e))
|
log.error_or_exception("Database error: {}".format(e))
|
||||||
flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
|
flash(_("Oops! Database Error: %(error)s.", error=e.orig if hasattr(e, "orig") else e),
|
||||||
|
category="error")
|
||||||
return False # return redirect(url_for('web.show_book', book_id=book.id))
|
return False # return redirect(url_for('web.show_book', book_id=book.id))
|
||||||
|
|
||||||
# Queue uploader info
|
# Queue uploader info
|
||||||
link = '<a href="{}">{}</a>'.format(url_for('web.show_book', book_id=book.id), escape(book.title))
|
link = '<a href="{}">{}</a>'.format(url_for('web.show_book', book_id=book.id), escape(book.title))
|
||||||
upload_text = N_(u"File format %(ext)s added to %(book)s", ext=file_ext.upper(), book=link)
|
upload_text = N_("File format %(ext)s added to %(book)s", ext=file_ext.upper(), book=link)
|
||||||
WorkerThread.add(current_user.name, TaskUpload(upload_text, escape(book.title)))
|
WorkerThread.add(current_user.name, TaskUpload(upload_text, escape(book.title)))
|
||||||
|
|
||||||
return uploader.process(
|
return uploader.process(
|
||||||
saved_filename, *os.path.splitext(requested_file.filename),
|
saved_filename, *os.path.splitext(requested_file.filename),
|
||||||
rarExecutable=config.config_rarfile_location)
|
rar_executable=config.config_rarfile_location)
|
||||||
return None
|
return None
|
||||||
|
|
||||||
|
|
||||||
|
@ -1214,7 +1260,7 @@ def upload_cover(cover_request, book):
|
||||||
# check for empty request
|
# check for empty request
|
||||||
if requested_file.filename != '':
|
if requested_file.filename != '':
|
||||||
if not current_user.role_upload():
|
if not current_user.role_upload():
|
||||||
flash(_(u"User has no rights to upload cover"), category="error")
|
flash(_("User has no rights to upload cover"), category="error")
|
||||||
return False
|
return False
|
||||||
ret, message = helper.save_cover(requested_file, book.path)
|
ret, message = helper.save_cover(requested_file, book.path)
|
||||||
if ret is True:
|
if ret is True:
|
||||||
|
@ -1238,18 +1284,18 @@ def handle_title_on_edit(book, book_title):
|
||||||
|
|
||||||
|
|
||||||
def handle_author_on_edit(book, author_name, update_stored=True):
|
def handle_author_on_edit(book, author_name, update_stored=True):
|
||||||
|
change = False
|
||||||
# handle author(s)
|
# handle author(s)
|
||||||
input_authors, renamed = prepare_authors(author_name)
|
input_authors, renamed = prepare_authors(author_name)
|
||||||
|
|
||||||
change = modify_database_object(input_authors, book.authors, db.Authors, calibre_db.session, 'author')
|
# change |= modify_database_object(input_authors, book.authors, db.Authors, calibre_db.session, 'author')
|
||||||
|
|
||||||
# Search for each author if author is in database, if not, author name and sorted author name is generated new
|
# Search for each author if author is in database, if not, author name and sorted author name is generated new
|
||||||
# everything then is assembled for sorted author field in database
|
# everything then is assembled for sorted author field in database
|
||||||
sort_authors_list = list()
|
sort_authors_list = list()
|
||||||
for inp in input_authors:
|
for inp in input_authors:
|
||||||
stored_author = calibre_db.session.query(db.Authors).filter(db.Authors.name == inp).first()
|
stored_author = calibre_db.session.query(db.Authors).filter(db.Authors.name == inp).first()
|
||||||
if not stored_author:
|
if not stored_author:
|
||||||
stored_author = helper.get_sorted_author(inp)
|
stored_author = helper.get_sorted_author(inp.replace('|', ','))
|
         else:
             stored_author = stored_author.sort
         sort_authors_list.append(helper.get_sorted_author(stored_author))
@@ -1257,6 +1303,9 @@ def handle_author_on_edit(book, author_name, update_stored=True):
     if book.author_sort != sort_authors and update_stored:
         book.author_sort = sort_authors
         change = True
+
+    change |= modify_database_object(input_authors, book.authors, db.Authors, calibre_db.session, 'author')
+
     return input_authors, change, renamed


@@ -1264,14 +1313,15 @@ def search_objects_remove(db_book_object, db_type, input_elements):
     del_elements = []
     for c_elements in db_book_object:
         found = False
-        if db_type == 'languages':
-            type_elements = c_elements.lang_code
-        elif db_type == 'custom':
+        #if db_type == 'languages':
+        #    type_elements = c_elements.lang_code
+        if db_type == 'custom':
             type_elements = c_elements.value
         else:
-            type_elements = c_elements.name
+            # type_elements = c_elements.name
+            type_elements = c_elements
         for inp_element in input_elements:
-            if inp_element.lower() == type_elements.lower():
+            if type_elements == inp_element:
                 found = True
                 break
         # if the element was not found in the new list, add it to remove list
@@ -1285,13 +1335,11 @@ def search_objects_add(db_book_object, db_type, input_elements):
     for inp_element in input_elements:
         found = False
         for c_elements in db_book_object:
-            if db_type == 'languages':
-                type_elements = c_elements.lang_code
-            elif db_type == 'custom':
+            if db_type == 'custom':
                 type_elements = c_elements.value
             else:
-                type_elements = c_elements.name
-            if inp_element == type_elements:
+                type_elements = c_elements
+            if type_elements == inp_element:
                 found = True
                 break
         if not found:
@@ -1307,6 +1355,7 @@ def remove_objects(db_book_object, db_session, del_elements):
             changed = True
             if len(del_element.books) == 0:
                 db_session.delete(del_element)
+                db_session.flush()
     return changed


@@ -1320,27 +1369,34 @@ def add_objects(db_book_object, db_object, db_session, db_type, add_elements):
     db_filter = db_object.name
     for add_element in add_elements:
         # check if an element with that name exists
-        db_element = db_session.query(db_object).filter(db_filter == add_element).first()
+        changed = True
+        # db_session.query(db.Tags).filter((func.lower(db.Tags.name).ilike("GênOt"))).all()
+        db_element = db_session.query(db_object).filter((func.lower(db_filter).ilike(add_element))).first()
+        # db_element = db_session.query(db_object).filter(func.lower(db_filter) == add_element.lower()).first()
         # if no element is found add it
-        if db_type == 'author':
-            new_element = db_object(add_element, helper.get_sorted_author(add_element.replace('|', ',')), "")
-        elif db_type == 'series':
-            new_element = db_object(add_element, add_element)
-        elif db_type == 'custom':
-            new_element = db_object(value=add_element)
-        elif db_type == 'publisher':
-            new_element = db_object(add_element, None)
-        else:  # db_type should be tag or language
-            new_element = db_object(add_element)
         if db_element is None:
-            changed = True
+            if db_type == 'author':
+                new_element = db_object(add_element, helper.get_sorted_author(add_element.replace('|', ',')))
+            elif db_type == 'series':
+                new_element = db_object(add_element, add_element)
+            elif db_type == 'custom':
+                new_element = db_object(value=add_element)
+            elif db_type == 'publisher':
+                new_element = db_object(add_element, None)
+            else:  # db_type should be tag or language
+                new_element = db_object(add_element)
             db_session.add(new_element)
             db_book_object.append(new_element)
         else:
-            db_element = create_objects_for_addition(db_element, add_element, db_type)
+            db_no_case = db_session.query(db_object).filter(db_filter == add_element).first()
+            if db_no_case:
+                # check for new case of element
+                db_element = create_objects_for_addition(db_element, add_element, db_type)
+            else:
+                db_element = create_objects_for_addition(db_element, add_element, db_type)
             # add element to book
-            changed = True
             db_book_object.append(db_element)

     return changed


@@ -1375,13 +1431,24 @@ def modify_database_object(input_elements, db_book_object, db_object, db_session
     if not isinstance(input_elements, list):
         raise TypeError(str(input_elements) + " should be passed as a list")
     input_elements = [x for x in input_elements if x != '']
-    # we have all input element (authors, series, tags) names now
+    changed = False
+    # If elements are renamed (upper lower case), rename it
+    for rec_a, rec_b in zip(db_book_object, input_elements):
+        if db_type == "custom":
+            if rec_a.value.casefold() == rec_b.casefold() and rec_a.value != rec_b:
+                create_objects_for_addition(rec_a, rec_b, db_type)
+        else:
+            if rec_a.get().casefold() == rec_b.casefold() and rec_a.get() != rec_b:
+                create_objects_for_addition(rec_a, rec_b, db_type)
+    # we have all input element (authors, series, tags) names now
     # 1. search for elements to remove
     del_elements = search_objects_remove(db_book_object, db_type, input_elements)
     # 2. search for elements that need to be added
     add_elements = search_objects_add(db_book_object, db_type, input_elements)

     # if there are elements to remove, we remove them now
-    changed = remove_objects(db_book_object, db_session, del_elements)
+    changed |= remove_objects(db_book_object, db_session, del_elements)
     # if there are elements to add, we add them now!
    if len(add_elements) > 0:
         changed |= add_objects(db_book_object, db_object, db_session, db_type, add_elements)
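The key change in `add_objects` is the lookup: instead of an exact `db_filter == add_element` comparison, the element is first matched case-insensitively with `func.lower(db_filter).ilike(add_element)`, and an exact-case query is only run afterwards to detect a renamed casing. A minimal, self-contained sketch of that SQLAlchemy pattern (the toy `Tag` model is illustrative, not Calibre-Web's real schema):

```python
# Sketch: case-insensitive lookup with func.lower(...).ilike(...), as in the diff.
from sqlalchemy import Column, Integer, String, create_engine, func
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class Tag(Base):  # hypothetical model for illustration only
    __tablename__ = "tags"
    id = Column(Integer, primary_key=True)
    name = Column(String)

engine = create_engine("sqlite://")  # in-memory database
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Tag(name="Fantasy"))
    session.commit()
    # Exact comparison misses a differently-cased input ...
    exact = session.query(Tag).filter(Tag.name == "fantasy").first()
    # ... while lowering the column and using ilike finds the existing row.
    relaxed = session.query(Tag).filter(func.lower(Tag.name).ilike("fantasy")).first()
    relaxed_name = relaxed.name
```

This is why the old exact-match path survives inside the `else:` branch as `db_no_case`: it distinguishes "same element, same case" from "same element, new case" before deciding whether to rename.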
@@ -0,0 +1,63 @@
+# -*- coding: utf-8 -*-
+
+# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
+# Copyright (C) 2024 OzzieIsaacs
+#
+# This program is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+
+from uuid import uuid4
+import os
+
+from .file_helper import get_temp_dir
+from .subproc_wrapper import process_open
+from . import logger, config
+from .constants import SUPPORTED_CALIBRE_BINARIES
+
+log = logger.create()
+
+
+def do_calibre_export(book_id, book_format):
+    try:
+        quotes = [3, 5, 7, 9]
+        tmp_dir = get_temp_dir()
+        calibredb_binarypath = get_calibre_binarypath("calibredb")
+        temp_file_name = str(uuid4())
+        my_env = os.environ.copy()
+        if config.config_calibre_split:
+            my_env['CALIBRE_OVERRIDE_DATABASE_PATH'] = os.path.join(config.config_calibre_dir, "metadata.db")
+            library_path = config.config_calibre_split_dir
+        else:
+            library_path = config.config_calibre_dir
+        opf_command = [calibredb_binarypath, 'export', '--dont-write-opf', '--with-library', library_path,
+                       '--to-dir', tmp_dir, '--formats', book_format, "--template", "{}".format(temp_file_name),
+                       str(book_id)]
+        p = process_open(opf_command, quotes, my_env)
+        _, err = p.communicate()
+        if err:
+            log.error('Metadata embedder encountered an error: %s', err)
+        return tmp_dir, temp_file_name
+    except OSError as ex:
+        # ToDo real error handling
+        log.error_or_exception(ex)
+        return None, None
+
+
+def get_calibre_binarypath(binary):
+    binariesdir = config.config_binariesdir
+    if binariesdir:
+        try:
+            return os.path.join(binariesdir, SUPPORTED_CALIBRE_BINARIES[binary])
+        except KeyError as ex:
+            log.error("Binary not supported by Calibre-Web: %s", SUPPORTED_CALIBRE_BINARIES[binary])
+            pass
+    return ""
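The new helper shells out to `calibredb export` with a UUID-based `--template` so the exported file gets a predictable, unique name inside the temp directory. A sketch of just the command assembly (the `build_export_command` wrapper and the paths are illustrative; nothing is executed here):

```python
# Sketch: assemble the calibredb export command line, as the new helper does.
import os
from uuid import uuid4

def build_export_command(calibredb, library_path, tmp_dir, book_id, book_format):
    # A fresh UUID serves as the export template, so the resulting file
    # can be located unambiguously in tmp_dir afterwards.
    temp_file_name = str(uuid4())
    cmd = [calibredb, 'export', '--dont-write-opf',
           '--with-library', library_path,
           '--to-dir', tmp_dir,
           '--formats', book_format,
           '--template', temp_file_name,
           str(book_id)]
    return cmd, temp_file_name

cmd, name = build_export_command('calibredb', '/books', '/tmp/calibre_web', 42, 'epub')
```

The diff's split-library branch only changes which directory is passed as `library_path` and sets `CALIBRE_OVERRIDE_DATABASE_PATH` in the subprocess environment.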
cps/epub.py (132 changed lines)
@@ -21,25 +21,47 @@ import zipfile
 from lxml import etree

 from . import isoLanguages, cover
+from . import config, logger
 from .helper import split_authors
+from .epub_helper import get_content_opf, default_ns
 from .constants import BookMeta
+
+log = logger.create()


 def _extract_cover(zip_file, cover_file, cover_path, tmp_file_name):
     if cover_file is None:
         return None
-    else:
-        cf = extension = None
-        zip_cover_path = os.path.join(cover_path, cover_file).replace('\\', '/')
-
-        prefix = os.path.splitext(tmp_file_name)[0]
-        tmp_cover_name = prefix + '.' + os.path.basename(zip_cover_path)
-        ext = os.path.splitext(tmp_cover_name)
-        if len(ext) > 1:
-            extension = ext[1].lower()
-        if extension in cover.COVER_EXTENSIONS:
-            cf = zip_file.read(zip_cover_path)
-        return cover.cover_processing(tmp_file_name, cf, extension)
+
+    cf = extension = None
+    zip_cover_path = os.path.join(cover_path, cover_file).replace('\\', '/')
+
+    prefix = os.path.splitext(tmp_file_name)[0]
+    tmp_cover_name = prefix + '.' + os.path.basename(zip_cover_path)
+    ext = os.path.splitext(tmp_cover_name)
+    if len(ext) > 1:
+        extension = ext[1].lower()
+    if extension in cover.COVER_EXTENSIONS:
+        cf = zip_file.read(zip_cover_path)
+    return cover.cover_processing(tmp_file_name, cf, extension)
+
+
+def get_epub_layout(book, book_data):
+    file_path = os.path.normpath(os.path.join(config.get_book_path(),
+                                              book.path, book_data.name + "." + book_data.format.lower()))
+
+    try:
+        tree, __ = get_content_opf(file_path, default_ns)
+        p = tree.xpath('/pkg:package/pkg:metadata', namespaces=default_ns)[0]
+
+        layout = p.xpath('pkg:meta[@property="rendition:layout"]/text()', namespaces=default_ns)
+    except (etree.XMLSyntaxError, KeyError, IndexError, OSError) as e:
+        log.error("Could not parse epub metadata of book {} during kobo sync: {}".format(book.id, e))
+        layout = []
+
+    if len(layout) == 0:
+        return None
+    else:
+        return layout[0]


 def get_epub_info(tmp_file_path, original_file_name, original_file_extension):
@@ -49,13 +71,7 @@ def get_epub_info(tmp_file_path, original_file_name, original_file_extension):
         'dc': 'http://purl.org/dc/elements/1.1/'
     }

-    epub_zip = zipfile.ZipFile(tmp_file_path)
-
-    txt = epub_zip.read('META-INF/container.xml')
-    tree = etree.fromstring(txt)
-    cf_name = tree.xpath('n:rootfiles/n:rootfile/@full-path', namespaces=ns)[0]
-    cf = epub_zip.read(cf_name)
-    tree = etree.fromstring(cf)
+    tree, cf_name = get_content_opf(tmp_file_path, ns)

     cover_path = os.path.dirname(cf_name)

@@ -73,20 +89,20 @@ def get_epub_info(tmp_file_path, original_file_name, original_file_extension):
             elif s == 'date':
                 epub_metadata[s] = tmp[0][:10]
             else:
-                epub_metadata[s] = tmp[0]
+                epub_metadata[s] = tmp[0].strip()
         else:
             epub_metadata[s] = 'Unknown'

     if epub_metadata['subject'] == 'Unknown':
         epub_metadata['subject'] = ''

-    if epub_metadata['publisher'] == u'Unknown':
+    if epub_metadata['publisher'] == 'Unknown':
         epub_metadata['publisher'] = ''

-    if epub_metadata['date'] == u'Unknown':
+    if epub_metadata['date'] == 'Unknown':
         epub_metadata['date'] = ''

-    if epub_metadata['description'] == u'Unknown':
+    if epub_metadata['description'] == 'Unknown':
         description = tree.xpath("//*[local-name() = 'description']/text()")
         if len(description) > 0:
             epub_metadata['description'] = description
@@ -98,15 +114,19 @@ def get_epub_info(tmp_file_path, original_file_name, original_file_extension):

     epub_metadata = parse_epub_series(ns, tree, epub_metadata)

+    epub_zip = zipfile.ZipFile(tmp_file_path)
     cover_file = parse_epub_cover(ns, tree, epub_zip, cover_path, tmp_file_path)

     identifiers = []
     for node in p.xpath('dc:identifier', namespaces=ns):
-        identifier_name=node.attrib.values()[-1];
-        identifier_value=node.text;
-        if identifier_name in ('uuid','calibre'):
-            continue;
-        identifiers.append( [identifier_name, identifier_value] )
+        try:
+            identifier_name = node.attrib.values()[-1]
+        except IndexError:
+            continue
+        identifier_value = node.text
+        if identifier_name in ('uuid', 'calibre') or identifier_value is None:
+            continue
+        identifiers.append([identifier_name, identifier_value])

     if not epub_metadata['title']:
         title = original_file_name
@@ -131,40 +151,40 @@ def get_epub_info(tmp_file_path, original_file_name, original_file_extension):

 def parse_epub_cover(ns, tree, epub_zip, cover_path, tmp_file_path):
     cover_section = tree.xpath("/pkg:package/pkg:manifest/pkg:item[@id='cover-image']/@href", namespaces=ns)
-    cover_file = None
-    # if len(cover_section) > 0:
     for cs in cover_section:
         cover_file = _extract_cover(epub_zip, cs, cover_path, tmp_file_path)
         if cover_file:
-            break
-    if not cover_file:
-        meta_cover = tree.xpath("/pkg:package/pkg:metadata/pkg:meta[@name='cover']/@content", namespaces=ns)
-        if len(meta_cover) > 0:
+            return cover_file
+
+    meta_cover = tree.xpath("/pkg:package/pkg:metadata/pkg:meta[@name='cover']/@content", namespaces=ns)
+    if len(meta_cover) > 0:
+        cover_section = tree.xpath(
+            "/pkg:package/pkg:manifest/pkg:item[@id='"+meta_cover[0]+"']/@href", namespaces=ns)
+        if not cover_section:
             cover_section = tree.xpath(
-                "/pkg:package/pkg:manifest/pkg:item[@id='"+meta_cover[0]+"']/@href", namespaces=ns)
-            if not cover_section:
-                cover_section = tree.xpath(
-                    "/pkg:package/pkg:manifest/pkg:item[@properties='" + meta_cover[0] + "']/@href", namespaces=ns)
-        else:
-            cover_section = tree.xpath("/pkg:package/pkg:guide/pkg:reference/@href", namespaces=ns)
-        for cs in cover_section:
-            filetype = cs.rsplit('.', 1)[-1]
-            if filetype == "xhtml" or filetype == "html":  # if cover is (x)html format
-                markup = epub_zip.read(os.path.join(cover_path, cs))
-                markup_tree = etree.fromstring(markup)
-                # no matter xhtml or html with no namespace
-                img_src = markup_tree.xpath("//*[local-name() = 'img']/@src")
-                # Alternative image source
-                if not len(img_src):
-                    img_src = markup_tree.xpath("//attribute::*[contains(local-name(), 'href')]")
-                if len(img_src):
-                    # img_src maybe start with "../"" so fullpath join then relpath to cwd
-                    filename = os.path.relpath(os.path.join(os.path.dirname(os.path.join(cover_path, cover_section[0])),
-                                                            img_src[0]))
-                    cover_file = _extract_cover(epub_zip, filename, "", tmp_file_path)
-            else:
-                cover_file = _extract_cover(epub_zip, cs, cover_path, tmp_file_path)
-            if cover_file: break
+            "/pkg:package/pkg:manifest/pkg:item[@properties='" + meta_cover[0] + "']/@href", namespaces=ns)
+    else:
+        cover_section = tree.xpath("/pkg:package/pkg:guide/pkg:reference/@href", namespaces=ns)
+
+    cover_file = None
+    for cs in cover_section:
+        if cs.endswith('.xhtml') or cs.endswith('.html'):
+            markup = epub_zip.read(os.path.join(cover_path, cs))
+            markup_tree = etree.fromstring(markup)
+            # no matter xhtml or html with no namespace
+            img_src = markup_tree.xpath("//*[local-name() = 'img']/@src")
+            # Alternative image source
+            if not len(img_src):
+                img_src = markup_tree.xpath("//attribute::*[contains(local-name(), 'href')]")
+            if len(img_src):
+                # img_src maybe start with "../"" so fullpath join then relpath to cwd
+                filename = os.path.relpath(os.path.join(os.path.dirname(os.path.join(cover_path, cover_section[0])),
+                                                        img_src[0]))
+                cover_file = _extract_cover(epub_zip, filename, "", tmp_file_path)
+        else:
+            cover_file = _extract_cover(epub_zip, cs, cover_path, tmp_file_path)
+        if cover_file:
+            break
     return cover_file
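The container.xml handling removed from `get_epub_info` (now delegated to `get_content_opf`) boils down to two steps: read `META-INF/container.xml` from the zip, then resolve the `full-path` attribute of its rootfile to find the OPF document. A dependency-free sketch against a tiny in-memory EPUB (the real code uses lxml xpath; stdlib `ElementTree` is used here only to keep the example self-contained, and the file names are made up):

```python
# Sketch: locate the OPF rootfile of an EPUB via META-INF/container.xml.
import io
import zipfile
import xml.etree.ElementTree as ET

ns = {'n': 'urn:oasis:names:tc:opendocument:xmlns:container'}

container = (b'<?xml version="1.0"?>'
             b'<container xmlns="urn:oasis:names:tc:opendocument:xmlns:container">'
             b'<rootfiles><rootfile full-path="OEBPS/content.opf" '
             b'media-type="application/oebps-package+xml"/></rootfiles></container>')

# Build a minimal EPUB-like zip in memory.
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w') as zf:
    zf.writestr('META-INF/container.xml', container)
    zf.writestr('OEBPS/content.opf', b'<package/>')

epub_zip = zipfile.ZipFile(buf)
root = ET.fromstring(epub_zip.read('META-INF/container.xml'))
# Same lookup as get_content_opf: follow rootfiles/rootfile/@full-path.
cf_name = root.find('n:rootfiles/n:rootfile', ns).attrib['full-path']
```

Centralizing this in one helper is what lets the new `get_epub_layout` reuse the same OPF access path for Kobo sync.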
@@ -0,0 +1,166 @@
+# -*- coding: utf-8 -*-
+
+# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
+# Copyright (C) 2018 lemmsh, Kennyl, Kyosfonica, matthazinski
+#
+# This program is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+
+import zipfile
+from lxml import etree
+
+from . import isoLanguages
+
+default_ns = {
+    'n': 'urn:oasis:names:tc:opendocument:xmlns:container',
+    'pkg': 'http://www.idpf.org/2007/opf',
+}
+
+OPF_NAMESPACE = "http://www.idpf.org/2007/opf"
+PURL_NAMESPACE = "http://purl.org/dc/elements/1.1/"
+
+OPF = "{%s}" % OPF_NAMESPACE
+PURL = "{%s}" % PURL_NAMESPACE
+
+etree.register_namespace("opf", OPF_NAMESPACE)
+etree.register_namespace("dc", PURL_NAMESPACE)
+
+OPF_NS = {None: OPF_NAMESPACE}  # the default namespace (no prefix)
+NSMAP = {'dc': PURL_NAMESPACE, 'opf': OPF_NAMESPACE}
+
+
+def updateEpub(src, dest, filename, data, ):
+    # create a temp copy of the archive without filename
+    with zipfile.ZipFile(src, 'r') as zin:
+        with zipfile.ZipFile(dest, 'w') as zout:
+            zout.comment = zin.comment  # preserve the comment
+            for item in zin.infolist():
+                if item.filename != filename:
+                    zout.writestr(item, zin.read(item.filename))
+
+    # now add filename with its new data
+    with zipfile.ZipFile(dest, mode='a', compression=zipfile.ZIP_DEFLATED) as zf:
+        zf.writestr(filename, data)
+
+
+def get_content_opf(file_path, ns=default_ns):
+    epubZip = zipfile.ZipFile(file_path)
+    txt = epubZip.read('META-INF/container.xml')
+    tree = etree.fromstring(txt)
+    cf_name = tree.xpath('n:rootfiles/n:rootfile/@full-path', namespaces=ns)[0]
+    cf = epubZip.read(cf_name)
+
+    return etree.fromstring(cf), cf_name
+
+
+def create_new_metadata_backup(book, custom_columns, export_language, translated_cover_name, lang_type=3):
+    # generate root package element
+    package = etree.Element(OPF + "package", nsmap=OPF_NS)
+    package.set("unique-identifier", "uuid_id")
+    package.set("version", "2.0")
+
+    # generate metadata element and all sub elements of it
+    metadata = etree.SubElement(package, "metadata", nsmap=NSMAP)
+    identifier = etree.SubElement(metadata, PURL + "identifier", id="calibre_id", nsmap=NSMAP)
+    identifier.set(OPF + "scheme", "calibre")
+    identifier.text = str(book.id)
+    identifier2 = etree.SubElement(metadata, PURL + "identifier", id="uuid_id", nsmap=NSMAP)
+    identifier2.set(OPF + "scheme", "uuid")
+    identifier2.text = book.uuid
+    for i in book.identifiers:
+        identifier = etree.SubElement(metadata, PURL + "identifier", nsmap=NSMAP)
+        identifier.set(OPF + "scheme", i.format_type())
+        identifier.text = str(i.val)
+    title = etree.SubElement(metadata, PURL + "title", nsmap=NSMAP)
+    title.text = book.title
+    for author in book.authors:
+        creator = etree.SubElement(metadata, PURL + "creator", nsmap=NSMAP)
+        creator.text = str(author.name)
+        creator.set(OPF + "file-as", book.author_sort)  # ToDo Check
+        creator.set(OPF + "role", "aut")
+    contributor = etree.SubElement(metadata, PURL + "contributor", nsmap=NSMAP)
+    contributor.text = "calibre (5.7.2) [https://calibre-ebook.com]"
+    contributor.set(OPF + "file-as", "calibre")  # ToDo Check
+    contributor.set(OPF + "role", "bkp")
+
+    date = etree.SubElement(metadata, PURL + "date", nsmap=NSMAP)
+    date.text = '{d.year:04}-{d.month:02}-{d.day:02}T{d.hour:02}:{d.minute:02}:{d.second:02}'.format(d=book.pubdate)
+    if book.comments and book.comments[0].text:
+        for b in book.comments:
+            description = etree.SubElement(metadata, PURL + "description", nsmap=NSMAP)
+            description.text = b.text
+    for b in book.publishers:
+        publisher = etree.SubElement(metadata, PURL + "publisher", nsmap=NSMAP)
+        publisher.text = str(b.name)
+    if not book.languages:
+        language = etree.SubElement(metadata, PURL + "language", nsmap=NSMAP)
+        language.text = export_language
+    else:
+        for b in book.languages:
+            language = etree.SubElement(metadata, PURL + "language", nsmap=NSMAP)
+            language.text = str(b.lang_code) if lang_type == 3 else isoLanguages.get(part3=b.lang_code).part1
+    for b in book.tags:
+        subject = etree.SubElement(metadata, PURL + "subject", nsmap=NSMAP)
+        subject.text = str(b.name)
+    etree.SubElement(metadata, "meta", name="calibre:author_link_map",
+                     content="{" + ", ".join(['"' + str(a.name) + '": ""' for a in book.authors]) + "}",
+                     nsmap=NSMAP)
+    for b in book.series:
+        etree.SubElement(metadata, "meta", name="calibre:series",
+                         content=str(str(b.name)),
+                         nsmap=NSMAP)
+    if book.series:
+        etree.SubElement(metadata, "meta", name="calibre:series_index",
+                         content=str(book.series_index),
+                         nsmap=NSMAP)
+    if len(book.ratings) and book.ratings[0].rating > 0:
+        etree.SubElement(metadata, "meta", name="calibre:rating",
+                         content=str(book.ratings[0].rating),
+                         nsmap=NSMAP)
+    etree.SubElement(metadata, "meta", name="calibre:timestamp",
+                     content='{d.year:04}-{d.month:02}-{d.day:02}T{d.hour:02}:{d.minute:02}:{d.second:02}'.format(
+                         d=book.timestamp),
+                     nsmap=NSMAP)
+    etree.SubElement(metadata, "meta", name="calibre:title_sort",
+                     content=book.sort,
+                     nsmap=NSMAP)
+    sequence = 0
+    for cc in custom_columns:
+        value = None
+        extra = None
+        cc_entry = getattr(book, "custom_column_" + str(cc.id))
+        if cc_entry.__len__():
+            value = [c.value for c in cc_entry] if cc.is_multiple else cc_entry[0].value
+            extra = cc_entry[0].extra if hasattr(cc_entry[0], "extra") else None
+        etree.SubElement(metadata, "meta", name="calibre:user_metadata:#{}".format(cc.label),
+                         content=cc.to_json(value, extra, sequence),
+                         nsmap=NSMAP)
+        sequence += 1
+
+    # generate guide element and all sub elements of it
+    # Title is translated from default export language
+    guide = etree.SubElement(package, "guide")
+    etree.SubElement(guide, "reference", type="cover", title=translated_cover_name, href="cover.jpg")
+
+    return package
+
+
+def replace_metadata(tree, package):
+    rep_element = tree.xpath('/pkg:package/pkg:metadata', namespaces=default_ns)[0]
+    new_element = package.xpath('//metadata', namespaces=default_ns)[0]
+    tree.replace(rep_element, new_element)
+    return etree.tostring(tree,
+                          xml_declaration=True,
+                          encoding='utf-8',
+                          pretty_print=True).decode('utf-8')
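`updateEpub()` works around the fact that `zipfile` cannot replace an archive member in place: it copies every other entry into a fresh archive, then appends the member with its new data. The same two-pass technique on in-memory buffers (`replace_member` is an illustrative name, not part of the module):

```python
# Sketch: replace one zip member by rewriting the archive, as updateEpub() does.
import io
import zipfile

def replace_member(src_bytes, filename, data):
    dest = io.BytesIO()
    # Pass 1: copy every entry except the one being replaced.
    with zipfile.ZipFile(io.BytesIO(src_bytes), 'r') as zin, \
         zipfile.ZipFile(dest, 'w') as zout:
        zout.comment = zin.comment  # preserve the archive comment
        for item in zin.infolist():
            if item.filename != filename:
                zout.writestr(item, zin.read(item.filename))
    # Pass 2: append the member with its new content.
    with zipfile.ZipFile(dest, mode='a', compression=zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(filename, data)
    return dest.getvalue()

# Build a toy archive and swap out one member.
src = io.BytesIO()
with zipfile.ZipFile(src, 'w') as zf:
    zf.writestr('content.opf', b'old')
    zf.writestr('chapter1.html', b'<p>hi</p>')

out = replace_member(src.getvalue(), 'content.opf', b'new')
with zipfile.ZipFile(io.BytesIO(out)) as zf:
    updated = zf.read('content.opf')
    names = zf.namelist()
```

Copying via `writestr(item, ...)` with the original `ZipInfo` keeps each entry's metadata (timestamps, permissions) intact, which matters for EPUBs that readers validate strictly.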
cps/fb2.py (14 changed lines)
@@ -38,19 +38,19 @@ def get_fb2_info(tmp_file_path, original_file_extension):
         if len(last_name):
             last_name = last_name[0]
         else:
-            last_name = u''
+            last_name = ''
         middle_name = element.xpath('fb:middle-name/text()', namespaces=ns)
         if len(middle_name):
             middle_name = middle_name[0]
         else:
-            middle_name = u''
+            middle_name = ''
         first_name = element.xpath('fb:first-name/text()', namespaces=ns)
         if len(first_name):
             first_name = first_name[0]
         else:
-            first_name = u''
-        return (first_name + u' '
-                + middle_name + u' '
+            first_name = ''
+        return (first_name + ' '
+                + middle_name + ' '
                 + last_name)

     author = str(", ".join(map(get_author, authors)))
@@ -59,12 +59,12 @@ def get_fb2_info(tmp_file_path, original_file_extension):
     if len(title):
         title = str(title[0])
     else:
-        title = u''
+        title = ''
     description = tree.xpath('/fb:FictionBook/fb:description/fb:publish-info/fb:book-name/text()', namespaces=ns)
     if len(description):
         description = str(description[0])
     else:
-        description = u''
+        description = ''

     return BookMeta(
         file_path=tmp_file_path,
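The `get_author` closure in this hunk reads the `first-name`/`middle-name`/`last-name` text nodes of each FB2 `<author>`, substituting `''` when one is missing. A stdlib sketch with a made-up minimal FB2 document (the real parser uses lxml xpath and keeps the double space a missing middle name leaves behind; this version simply skips empty parts, and the namespace URI is the standard FB2 2.0 one):

```python
# Sketch: join FB2 author name parts, tolerating missing nodes.
import xml.etree.ElementTree as ET

ns = {'fb': 'http://www.gribuser.ru/xml/fictionbook/2.0'}
doc = ET.fromstring(
    '<FictionBook xmlns="http://www.gribuser.ru/xml/fictionbook/2.0">'
    '<description><title-info><author>'
    '<first-name>Leo</first-name><last-name>Tolstoy</last-name>'
    '</author></title-info></description></FictionBook>')

def get_author(element):
    def text_of(tag):
        # Mirror the diff's pattern: use the node text if present, else ''.
        node = element.find('fb:' + tag, ns)
        return node.text if node is not None and node.text else ''
    parts = (text_of('first-name'), text_of('middle-name'), text_of('last-name'))
    return ' '.join(part for part in parts if part)

author = ', '.join(map(get_author, doc.findall('.//fb:author', ns)))
```

The u-prefix removals in the hunk itself are purely cosmetic: on Python 3, `u''` and `''` are the same object type.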
@@ -0,0 +1,32 @@
+# -*- coding: utf-8 -*-
+
+# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
+# Copyright (C) 2023 OzzieIsaacs
+#
+# This program is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+
+from tempfile import gettempdir
+import os
+import shutil
+
+
+def get_temp_dir():
+    tmp_dir = os.path.join(gettempdir(), 'calibre_web')
+    if not os.path.isdir(tmp_dir):
+        os.mkdir(tmp_dir)
+    return tmp_dir
+
+
+def del_temp_dir():
+    tmp_dir = os.path.join(gettempdir(), 'calibre_web')
+    shutil.rmtree(tmp_dir)
@ -23,7 +23,6 @@
|
||||||
import os
|
import os
|
||||||
import hashlib
|
import hashlib
|
||||||
import json
|
import json
|
||||||
import tempfile
|
|
||||||
from uuid import uuid4
|
from uuid import uuid4
|
||||||
from time import time
|
from time import time
|
||||||
from shutil import move, copyfile
|
from shutil import move, copyfile
|
||||||
|
@ -34,6 +33,7 @@ from flask_login import login_required
|
||||||
|
|
||||||
from . import logger, gdriveutils, config, ub, calibre_db, csrf
|
from . import logger, gdriveutils, config, ub, calibre_db, csrf
|
||||||
from .admin import admin_required
|
from .admin import admin_required
|
||||||
|
from .file_helper import get_temp_dir
|
||||||
|
|
||||||
gdrive = Blueprint('gdrive', __name__, url_prefix='/gdrive')
|
gdrive = Blueprint('gdrive', __name__, url_prefix='/gdrive')
|
||||||
log = logger.create()
|
log = logger.create()
|
||||||
|
@@ -55,7 +55,7 @@ def authenticate_google_drive():
     try:
         authUrl = gdriveutils.Gauth.Instance().auth.GetAuthUrl()
     except gdriveutils.InvalidConfigError:
-        flash(_(u'Google Drive setup not completed, try to deactivate and activate Google Drive again'),
+        flash(_('Google Drive setup not completed, try to deactivate and activate Google Drive again'),
              category="error")
        return redirect(url_for('web.index'))
    return redirect(authUrl)
@@ -91,9 +91,9 @@ def watch_gdrive():
         config.save()
     except HttpError as e:
         reason=json.loads(e.content)['error']['errors'][0]
-        if reason['reason'] == u'push.webhookUrlUnauthorized':
-            flash(_(u'Callback domain is not verified, '
-                    u'please follow steps to verify domain in google developer console'), category="error")
+        if reason['reason'] == 'push.webhookUrlUnauthorized':
+            flash(_('Callback domain is not verified, '
+                    'please follow steps to verify domain in google developer console'), category="error")
         else:
             flash(reason['message'], category="error")
 
@@ -139,9 +139,7 @@ try:
             dbpath = os.path.join(config.config_calibre_dir, "metadata.db").encode()
             if not response['deleted'] and response['file']['title'] == 'metadata.db' \
                     and response['file']['md5Checksum'] != hashlib.md5(dbpath):  # nosec
-                tmp_dir = os.path.join(tempfile.gettempdir(), 'calibre_web')
-                if not os.path.isdir(tmp_dir):
-                    os.mkdir(tmp_dir)
+                tmp_dir = get_temp_dir()
 
                 log.info('Database file updated')
                 copyfile(dbpath, os.path.join(tmp_dir, "metadata.db_" + str(current_milli_time())))
@@ -34,7 +34,6 @@ except ImportError:
 from sqlalchemy.ext.declarative import declarative_base
 from sqlalchemy.exc import OperationalError, InvalidRequestError, IntegrityError
 from sqlalchemy.orm.exc import StaleDataError
-from sqlalchemy.sql.expression import text
 
 try:
     from httplib2 import __version__ as httplib2_version
@@ -147,7 +146,7 @@ engine = create_engine('sqlite:///{0}'.format(cli_param.gd_path), echo=False)
 Base = declarative_base()
 
 # Open session for database connection
-Session = sessionmaker()
+Session = sessionmaker(autoflush=False)
 Session.configure(bind=engine)
 session = scoped_session(Session)
 
@@ -174,30 +173,12 @@ class PermissionAdded(Base):
         return str(self.gdrive_id)
 
 
-def migrate():
-    if not engine.dialect.has_table(engine.connect(), "permissions_added"):
-        PermissionAdded.__table__.create(bind = engine)
-    for sql in session.execute(text("select sql from sqlite_master where type='table'")):
-        if 'CREATE TABLE gdrive_ids' in sql[0]:
-            currUniqueConstraint = 'UNIQUE (gdrive_id)'
-            if currUniqueConstraint in sql[0]:
-                sql=sql[0].replace(currUniqueConstraint, 'UNIQUE (gdrive_id, path)')
-                sql=sql.replace(GdriveId.__tablename__, GdriveId.__tablename__ + '2')
-                session.execute(sql)
-                session.execute("INSERT INTO gdrive_ids2 (id, gdrive_id, path) SELECT id, "
-                                "gdrive_id, path FROM gdrive_ids;")
-                session.commit()
-                session.execute('DROP TABLE %s' % 'gdrive_ids')
-                session.execute('ALTER TABLE gdrive_ids2 RENAME to gdrive_ids')
-            break
-
 if not os.path.exists(cli_param.gd_path):
     try:
         Base.metadata.create_all(engine)
     except Exception as ex:
         log.error("Error connect to database: {} - {}".format(cli_param.gd_path, ex))
         raise
-migrate()
 
 
 def getDrive(drive=None, gauth=None):
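The removed `migrate()` used SQLite's rebuild-and-rename idiom to widen a UNIQUE constraint, since SQLite cannot alter a constraint in place: create a patched copy of the table, copy the rows over, drop the original, rename the copy. A self-contained sketch of that idiom with a toy table (the column names echo the diff, but this is not the real `gdrive_ids` schema):

```python
import sqlite3

con = sqlite3.connect(':memory:')
cur = con.cursor()

# Old schema: UNIQUE on a single column
cur.execute("CREATE TABLE ids (id INTEGER PRIMARY KEY, gdrive_id TEXT, path TEXT,"
            " UNIQUE (gdrive_id))")
cur.execute("INSERT INTO ids (gdrive_id, path) VALUES ('abc', '/Author/Title')")

# Rebuild with the wider constraint: copy, drop, rename
cur.execute("CREATE TABLE ids2 (id INTEGER PRIMARY KEY, gdrive_id TEXT, path TEXT,"
            " UNIQUE (gdrive_id, path))")
cur.execute("INSERT INTO ids2 (id, gdrive_id, path) SELECT id, gdrive_id, path FROM ids")
cur.execute("DROP TABLE ids")
cur.execute("ALTER TABLE ids2 RENAME TO ids")
con.commit()

# The same gdrive_id may now appear under two different paths
cur.execute("INSERT INTO ids (gdrive_id, path) VALUES ('abc', '/Author/Other Title')")
rows = cur.execute("SELECT COUNT(*) FROM ids").fetchone()[0]
```

The commit drops this migration entirely (along with its `text` import), presumably because the old single-column schema is no longer in the field.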
@@ -344,7 +325,7 @@ def getFileFromEbooksFolder(path, fileName):
 
 
 def moveGdriveFileRemote(origin_file_id, new_title):
-    origin_file_id['title']= new_title
+    origin_file_id['title'] = new_title
     origin_file_id.Upload()
 
 
@@ -422,7 +403,7 @@ def copyToDrive(drive, uploadFile, createRoot, replaceFiles,
     driveFile.Upload()
 
 
-def uploadFileToEbooksFolder(destFile, f):
+def uploadFileToEbooksFolder(destFile, f, string=False):
     drive = getDrive(Gdrive.Instance().drive)
     parent = getEbooksFolder(drive)
     splitDir = destFile.split('/')
@@ -435,7 +416,10 @@ def uploadFileToEbooksFolder(destFile, f):
             else:
                 driveFile = drive.CreateFile({'title': x,
                                               'parents': [{"kind": "drive#fileLink", 'id': parent['id']}], })
-                driveFile.SetContentFile(f)
+                if not string:
+                    driveFile.SetContentFile(f)
+                else:
+                    driveFile.SetContentString(f)
                 driveFile.Upload()
         else:
             existing_Folder = drive.ListFile({'q': "title = '%s' and '%s' in parents and trashed = false" %
@@ -556,7 +540,7 @@ def updateGdriveCalibreFromLocal():
 
 # update gdrive.db on edit of books title
 def updateDatabaseOnEdit(ID,newPath):
-    sqlCheckPath = newPath if newPath[-1] == '/' else newPath + u'/'
+    sqlCheckPath = newPath if newPath[-1] == '/' else newPath + '/'
     storedPathName = session.query(GdriveId).filter(GdriveId.gdrive_id == ID).first()
     if storedPathName:
         storedPathName.path = sqlCheckPath
@@ -578,6 +562,7 @@ def deleteDatabaseEntry(ID):
 
 
 # Gets cover file from gdrive
+# ToDo: Check is this right everyone get read permissions on cover files?
 def get_cover_via_gdrive(cover_path):
     df = getFileFromEbooksFolder(cover_path, 'cover.jpg')
     if df:
@@ -600,6 +585,29 @@ def get_cover_via_gdrive(cover_path):
     else:
         return None
 
+
+# Gets cover file from gdrive
+def get_metadata_backup_via_gdrive(metadata_path):
+    df = getFileFromEbooksFolder(metadata_path, 'metadata.opf')
+    if df:
+        if not session.query(PermissionAdded).filter(PermissionAdded.gdrive_id == df['id']).first():
+            df.GetPermissions()
+            df.InsertPermission({
+                'type': 'anyone',
+                'value': 'anyone',
+                'role': 'writer',  # ToDo needs write access
+                'withLink': True})
+            permissionAdded = PermissionAdded()
+            permissionAdded.gdrive_id = df['id']
+            session.add(permissionAdded)
+            try:
+                session.commit()
+            except OperationalError as ex:
+                log.error_or_exception('Database error: {}'.format(ex))
+                session.rollback()
+        return df.metadata.get('webContentLink')
+    else:
+        return None
 
 # Creates chunks for downloading big files
 def partial(total_byte_len, part_size_limit):
     s = []
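`partial()`, whose body is cut off above, computes the byte ranges used to download large Drive files in chunks. A plausible reconstruction of that behavior, offered as a sketch rather than the exact upstream code:

```python
def partial(total_byte_len, part_size_limit):
    # Split [0, total_byte_len) into inclusive [start, end] byte ranges,
    # each at most part_size_limit bytes wide
    s = []
    for p in range(0, total_byte_len, part_size_limit):
        last = min(total_byte_len - 1, p + part_size_limit - 1)
        s.append([p, last])
    return s
```

Each pair maps directly onto an HTTP `Range: bytes=start-end` request header, which is why the ranges are inclusive.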
@@ -18,20 +18,22 @@
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 
 import os
+import random
 import io
-import sys
 import mimetypes
 import re
+import regex
 import shutil
 import socket
 from datetime import datetime, timedelta
-from tempfile import gettempdir
 import requests
 import unidecode
+from uuid import uuid4
 
 from flask import send_from_directory, make_response, redirect, abort, url_for
 from flask_babel import gettext as _
 from flask_babel import lazy_gettext as N_
+from flask_babel import get_locale
 from flask_login import current_user
 from sqlalchemy.sql.expression import true, false, and_, or_, text, func
 from sqlalchemy.exc import InvalidRequestError, OperationalError
|
||||||
from .tasks.convert import TaskConvert
|
from .tasks.convert import TaskConvert
|
||||||
from . import logger, config, db, ub, fs
|
from . import logger, config, db, ub, fs
|
||||||
from . import gdriveutils as gd
|
from . import gdriveutils as gd
|
||||||
from .constants import STATIC_DIR as _STATIC_DIR, CACHE_TYPE_THUMBNAILS, THUMBNAIL_TYPE_COVER, THUMBNAIL_TYPE_SERIES
|
from .constants import (STATIC_DIR as _STATIC_DIR, CACHE_TYPE_THUMBNAILS, THUMBNAIL_TYPE_COVER, THUMBNAIL_TYPE_SERIES,
|
||||||
|
SUPPORTED_CALIBRE_BINARIES)
|
||||||
from .subproc_wrapper import process_wait
|
from .subproc_wrapper import process_wait
|
||||||
from .services.worker import WorkerThread
|
from .services.worker import WorkerThread
|
||||||
from .tasks.mail import TaskEmail
|
from .tasks.mail import TaskEmail
|
||||||
from .tasks.thumbnail import TaskClearCoverThumbnailCache, TaskGenerateCoverThumbnails
|
from .tasks.thumbnail import TaskClearCoverThumbnailCache, TaskGenerateCoverThumbnails
|
||||||
|
from .tasks.metadata_backup import TaskBackupMetadata
|
||||||
|
from .file_helper import get_temp_dir
|
||||||
|
from .epub_helper import get_content_opf, create_new_metadata_backup, updateEpub, replace_metadata
|
||||||
|
from .embed_helper import do_calibre_export
|
||||||
|
|
||||||
log = logger.create()
|
log = logger.create()
|
||||||
|
|
||||||
|
@@ -76,29 +83,29 @@ def convert_book_format(book_id, calibre_path, old_book_format, new_book_format,
     book = calibre_db.get_book(book_id)
     data = calibre_db.get_book_format(book.id, old_book_format)
     if not data:
-        error_message = _(u"%(format)s format not found for book id: %(book)d", format=old_book_format, book=book_id)
+        error_message = _("%(format)s format not found for book id: %(book)d", format=old_book_format, book=book_id)
         log.error("convert_book_format: %s", error_message)
         return error_message
     file_path = os.path.join(calibre_path, book.path, data.name)
     if config.config_use_google_drive:
         if not gd.getFileFromEbooksFolder(book.path, data.name + "." + old_book_format.lower()):
-            error_message = _(u"%(format)s not found on Google Drive: %(fn)s",
+            error_message = _("%(format)s not found on Google Drive: %(fn)s",
                               format=old_book_format, fn=data.name + "." + old_book_format.lower())
             return error_message
     else:
         if not os.path.exists(file_path + "." + old_book_format.lower()):
-            error_message = _(u"%(format)s not found: %(fn)s",
+            error_message = _("%(format)s not found: %(fn)s",
                               format=old_book_format, fn=data.name + "." + old_book_format.lower())
             return error_message
     # read settings and append converter task to queue
     if ereader_mail:
         settings = config.get_mail_settings()
-        settings['subject'] = _('Send to E-Reader')  # pretranslate Subject for e-mail
-        settings['body'] = _(u'This e-mail has been sent via Calibre-Web.')
+        settings['subject'] = _('Send to eReader')  # pretranslate Subject for Email
+        settings['body'] = _('This Email has been sent via Calibre-Web.')
     else:
         settings = dict()
     link = '<a href="{}">{}</a>'.format(url_for('web.show_book', book_id=book.id), escape(book.title))  # prevent xss
-    txt = u"{} -> {}: {}".format(
+    txt = "{} -> {}: {}".format(
         old_book_format.upper(),
         new_book_format.upper(),
         link)
|
@ -110,30 +117,30 @@ def convert_book_format(book_id, calibre_path, old_book_format, new_book_format,
|
||||||
|
|
||||||
# Texts are not lazy translated as they are supposed to get send out as is
|
# Texts are not lazy translated as they are supposed to get send out as is
|
||||||
def send_test_mail(ereader_mail, user_name):
|
def send_test_mail(ereader_mail, user_name):
|
||||||
WorkerThread.add(user_name, TaskEmail(_(u'Calibre-Web test e-mail'), None, None,
|
WorkerThread.add(user_name, TaskEmail(_('Calibre-Web Test Email'), None, None,
|
||||||
config.get_mail_settings(), ereader_mail, N_(u"Test e-mail"),
|
config.get_mail_settings(), ereader_mail, N_("Test Email"),
|
||||||
_(u'This e-mail has been sent via Calibre-Web.')))
|
_('This Email has been sent via Calibre-Web.')))
|
||||||
return
|
return
|
||||||
|
|
||||||
|
|
||||||
# Send registration email or password reset email, depending on parameter resend (False means welcome email)
|
# Send registration email or password reset email, depending on parameter resend (False means welcome email)
|
||||||
def send_registration_mail(e_mail, user_name, default_password, resend=False):
|
def send_registration_mail(e_mail, user_name, default_password, resend=False):
|
||||||
txt = "Hello %s!\r\n" % user_name
|
txt = "Hi %s!\r\n" % user_name
|
||||||
if not resend:
|
if not resend:
|
||||||
txt += "Your new account at Calibre-Web has been created. Thanks for joining us!\r\n"
|
txt += "Your account at Calibre-Web has been created.\r\n"
|
||||||
txt += "Please log in to your account using the following information:\r\n"
|
txt += "Please log in using the following information:\r\n"
|
||||||
txt += "User name: %s\r\n" % user_name
|
txt += "Username: %s\r\n" % user_name
|
||||||
txt += "Password: %s\r\n" % default_password
|
txt += "Password: %s\r\n" % default_password
|
||||||
txt += "Don't forget to change your password after first login.\r\n"
|
txt += "Don't forget to change your password after your first login.\r\n"
|
||||||
txt += "Sincerely\r\n\r\n"
|
txt += "Regards,\r\n\r\n"
|
||||||
txt += "Your Calibre-Web team"
|
txt += "Calibre-Web"
|
||||||
WorkerThread.add(None, TaskEmail(
|
WorkerThread.add(None, TaskEmail(
|
||||||
subject=_(u'Get Started with Calibre-Web'),
|
subject=_('Get Started with Calibre-Web'),
|
||||||
filepath=None,
|
filepath=None,
|
||||||
attachment=None,
|
attachment=None,
|
||||||
settings=config.get_mail_settings(),
|
settings=config.get_mail_settings(),
|
||||||
recipient=e_mail,
|
recipient=e_mail,
|
||||||
task_message=N_(u"Registration e-mail for user: %(name)s", name=user_name),
|
task_message=N_("Registration Email for user: %(name)s", name=user_name),
|
||||||
text=txt
|
text=txt
|
||||||
))
|
))
|
||||||
return
|
return
|
||||||
|
@ -144,13 +151,13 @@ def check_send_to_ereader_with_converter(formats):
|
||||||
if 'MOBI' in formats and 'EPUB' not in formats:
|
if 'MOBI' in formats and 'EPUB' not in formats:
|
||||||
book_formats.append({'format': 'Epub',
|
book_formats.append({'format': 'Epub',
|
||||||
'convert': 1,
|
'convert': 1,
|
||||||
'text': _('Convert %(orig)s to %(format)s and send to E-Reader',
|
'text': _('Convert %(orig)s to %(format)s and send to eReader',
|
||||||
orig='Mobi',
|
orig='Mobi',
|
||||||
format='Epub')})
|
format='Epub')})
|
||||||
if 'AZW3' in formats and 'EPUB' not in formats:
|
if 'AZW3' in formats and 'EPUB' not in formats:
|
||||||
book_formats.append({'format': 'Epub',
|
book_formats.append({'format': 'Epub',
|
||||||
'convert': 2,
|
'convert': 2,
|
||||||
'text': _('Convert %(orig)s to %(format)s and send to E-Reader',
|
'text': _('Convert %(orig)s to %(format)s and send to eReader',
|
||||||
orig='Azw3',
|
orig='Azw3',
|
||||||
format='Epub')})
|
format='Epub')})
|
||||||
return book_formats
|
return book_formats
|
||||||
|
@@ -158,7 +165,7 @@ def check_send_to_ereader_with_converter(formats):
 
 def check_send_to_ereader(entry):
     """
-    returns all available book formats for sending to E-Reader
+    returns all available book formats for sending to eReader
     """
     formats = list()
     book_formats = list()
@@ -169,31 +176,27 @@ def check_send_to_ereader(entry):
         if 'EPUB' in formats:
             book_formats.append({'format': 'Epub',
                                  'convert': 0,
-                                 'text': _('Send %(format)s to E-Reader', format='Epub')})
-        if 'MOBI' in formats:
-            book_formats.append({'format': 'Mobi',
-                                 'convert': 0,
-                                 'text': _('Send %(format)s to E-Reader', format='Mobi')})
+                                 'text': _('Send %(format)s to eReader', format='Epub')})
         if 'PDF' in formats:
             book_formats.append({'format': 'Pdf',
                                  'convert': 0,
-                                 'text': _('Send %(format)s to E-Reader', format='Pdf')})
+                                 'text': _('Send %(format)s to eReader', format='Pdf')})
         if 'AZW' in formats:
             book_formats.append({'format': 'Azw',
                                  'convert': 0,
-                                 'text': _('Send %(format)s to E-Reader', format='Azw')})
+                                 'text': _('Send %(format)s to eReader', format='Azw')})
         if config.config_converterpath:
             book_formats.extend(check_send_to_ereader_with_converter(formats))
         return book_formats
     else:
-        log.error(u'Cannot find book entry %d', entry.id)
+        log.error('Cannot find book entry %d', entry.id)
         return None
 
 
 # Check if a reader is existing for any of the book formats, if not, return empty list, otherwise return
 # list with supported formats
 def check_read_formats(entry):
-    extensions_reader = {'TXT', 'PDF', 'EPUB', 'CBZ', 'CBT', 'CBR', 'DJVU'}
+    extensions_reader = {'TXT', 'PDF', 'EPUB', 'CBZ', 'CBT', 'CBR', 'DJVU', 'DJV'}
     book_formats = list()
     if len(entry.data):
         for ele in iter(entry.data):
|
||||||
|
|
||||||
|
|
||||||
# Files are processed in the following order/priority:
|
# Files are processed in the following order/priority:
|
||||||
# 1: If Mobi file is existing, it's directly send to E-Reader email,
|
# 1: If epub file is existing, it's directly send to eReader email,
|
||||||
# 2: If Epub file is existing, it's converted and send to E-Reader email,
|
# 2: If mobi file is existing, it's converted and send to eReader email,
|
||||||
# 3: If Pdf file is existing, it's directly send to E-Reader email
|
# 3: If Pdf file is existing, it's directly send to eReader email
|
||||||
def send_mail(book_id, book_format, convert, ereader_mail, calibrepath, user_id):
|
def send_mail(book_id, book_format, convert, ereader_mail, calibrepath, user_id):
|
||||||
"""Send email with attachments"""
|
"""Send email with attachments"""
|
||||||
book = calibre_db.get_book(book_id)
|
book = calibre_db.get_book(book_id)
|
||||||
|
|
||||||
if convert == 1:
|
if convert == 1:
|
||||||
# returns None if success, otherwise errormessage
|
# returns None if success, otherwise errormessage
|
||||||
return convert_book_format(book_id, calibrepath, u'epub', book_format.lower(), user_id, ereader_mail)
|
return convert_book_format(book_id, calibrepath, 'mobi', book_format.lower(), user_id, ereader_mail)
|
||||||
if convert == 2:
|
if convert == 2:
|
||||||
# returns None if success, otherwise errormessage
|
# returns None if success, otherwise errormessage
|
||||||
return convert_book_format(book_id, calibrepath, u'azw3', book_format.lower(), user_id, ereader_mail)
|
return convert_book_format(book_id, calibrepath, 'azw3', book_format.lower(), user_id, ereader_mail)
|
||||||
|
|
||||||
for entry in iter(book.data):
|
for entry in iter(book.data):
|
||||||
if entry.format.upper() == book_format.upper():
|
if entry.format.upper() == book_format.upper():
|
||||||
converted_file_name = entry.name + '.' + book_format.lower()
|
converted_file_name = entry.name + '.' + book_format.lower()
|
||||||
link = '<a href="{}">{}</a>'.format(url_for('web.show_book', book_id=book_id), escape(book.title))
|
link = '<a href="{}">{}</a>'.format(url_for('web.show_book', book_id=book_id), escape(book.title))
|
||||||
email_text = N_(u"%(book)s send to E-Reader", book=link)
|
email_text = N_("%(book)s send to eReader", book=link)
|
||||||
WorkerThread.add(user_id, TaskEmail(_(u"Send to E-Reader"), book.path, converted_file_name,
|
WorkerThread.add(user_id, TaskEmail(_("Send to eReader"), book.path, converted_file_name,
|
||||||
config.get_mail_settings(), ereader_mail,
|
config.get_mail_settings(), ereader_mail,
|
||||||
email_text, _(u'This e-mail has been sent via Calibre-Web.')))
|
email_text, _('This Email has been sent via Calibre-Web.'),book.id))
|
||||||
return
|
return
|
||||||
return _(u"The requested file could not be read. Maybe wrong permissions?")
|
return _("The requested file could not be read. Maybe wrong permissions?")
|
||||||
|
|
||||||
|
|
||||||
def get_valid_filename(value, replace_whitespace=True, chars=128):
|
def get_valid_filename(value, replace_whitespace=True, chars=128):
|
||||||
|
@@ -234,16 +237,16 @@ def get_valid_filename(value, replace_whitespace=True, chars=128):
     Returns the given string converted to a string that can be used for a clean
     filename. Limits num characters to 128 max.
     """
-    if value[-1:] == u'.':
-        value = value[:-1]+u'_'
+    if value[-1:] == '.':
+        value = value[:-1]+'_'
     value = value.replace("/", "_").replace(":", "_").strip('\0')
     if config.config_unicode_filename:
         value = (unidecode.unidecode(value))
     if replace_whitespace:
         # *+:\"/<>? are replaced by _
-        value = re.sub(r'[*+:\\\"/<>?]+', u'_', value, flags=re.U)
+        value = re.sub(r'[*+:\\\"/<>?]+', '_', value, flags=re.U)
         # pipe has to be replaced with comma
-        value = re.sub(r'[|]+', u',', value, flags=re.U)
+        value = re.sub(r'[|]+', ',', value, flags=re.U)
 
     value = value.encode('utf-8')[:chars].decode('utf-8', errors='ignore').strip()
 
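The u-prefix cleanup above leaves `get_valid_filename`'s behavior unchanged, and its core substitutions can be exercised in isolation. The sketch below reproduces them minus the config-dependent `unidecode` step:

```python
import re


def sanitize(value, chars=128):
    # Mirror of the substitutions in get_valid_filename (simplified: the
    # optional unidecode transliteration step is omitted)
    if value[-1:] == '.':
        value = value[:-1] + '_'
    value = value.replace("/", "_").replace(":", "_").strip('\0')
    value = re.sub(r'[*+:\\\"/<>?]+', '_', value, flags=re.U)  # forbidden chars -> _
    value = re.sub(r'[|]+', ',', value, flags=re.U)            # pipe -> comma
    # Truncate on a byte budget, then drop any half-cut multibyte character
    return value.encode('utf-8')[:chars].decode('utf-8', errors='ignore').strip()
```

Truncating bytes before decoding with `errors='ignore'` keeps the result valid UTF-8 even when the cut lands inside a multibyte character.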
@@ -340,7 +343,7 @@ def edit_book_read_status(book_id, read_status=None):
             return "Custom Column No.{} does not exist in calibre database".format(config.config_read_column)
     except (OperationalError, InvalidRequestError) as ex:
         calibre_db.session.rollback()
-        log.error(u"Read status could not set: {}".format(ex))
+        log.error("Read status could not set: {}".format(ex))
         return _("Read status could not set: {}".format(ex.orig))
     return ""
 
@@ -415,8 +418,8 @@ def clean_author_database(renamed_author, calibre_path="", local_book=None, gdri
                     g_file = gd.getFileFromEbooksFolder(all_new_path,
                                                         file_format.name + '.' + file_format.format.lower())
                     if g_file:
-                        gd.moveGdriveFileRemote(g_file, all_new_name + u'.' + file_format.format.lower())
-                        gd.updateDatabaseOnEdit(g_file['id'], all_new_name + u'.' + file_format.format.lower())
+                        gd.moveGdriveFileRemote(g_file, all_new_name + '.' + file_format.format.lower())
+                        gd.updateDatabaseOnEdit(g_file['id'], all_new_name + '.' + file_format.format.lower())
                     else:
                         log.error("File {} not found on gdrive"
                                   .format(all_new_path, file_format.name + '.' + file_format.format.lower()))
@@ -509,25 +512,25 @@ def update_dir_structure_gdrive(book_id, first_author, renamed_author):
     authordir = book.path.split('/')[0]
     titledir = book.path.split('/')[1]
     new_authordir = rename_all_authors(first_author, renamed_author, gdrive=True)
-    new_titledir = get_valid_filename(book.title, chars=96) + u" (" + str(book_id) + u")"
+    new_titledir = get_valid_filename(book.title, chars=96) + " (" + str(book_id) + ")"
 
     if titledir != new_titledir:
         g_file = gd.getFileFromEbooksFolder(os.path.dirname(book.path), titledir)
         if g_file:
             gd.moveGdriveFileRemote(g_file, new_titledir)
-            book.path = book.path.split('/')[0] + u'/' + new_titledir
+            book.path = book.path.split('/')[0] + '/' + new_titledir
             gd.updateDatabaseOnEdit(g_file['id'], book.path)  # only child folder affected
         else:
-            return _(u'File %(file)s not found on Google Drive', file=book.path)  # file not found
+            return _('File %(file)s not found on Google Drive', file=book.path)  # file not found
 
     if authordir != new_authordir and authordir not in renamed_author:
         g_file = gd.getFileFromEbooksFolder(os.path.dirname(book.path), new_titledir)
         if g_file:
             gd.moveGdriveFolderRemote(g_file, new_authordir)
-            book.path = new_authordir + u'/' + book.path.split('/')[1]
+            book.path = new_authordir + '/' + book.path.split('/')[1]
             gd.updateDatabaseOnEdit(g_file['id'], book.path)
         else:
-            return _(u'File %(file)s not found on Google Drive', file=authordir)  # file not found
+            return _('File %(file)s not found on Google Drive', file=authordir)  # file not found
 
     # change location in database to new author/title path
     book.path = os.path.join(new_authordir, new_titledir).replace('\\', '/')
@@ -599,7 +602,7 @@ def delete_book_gdrive(book, book_format):
             gd.deleteDatabaseEntry(g_file['id'])
             g_file.Trash()
         else:
-            error = _(u'Book path %(path)s not found on Google Drive', path=book.path)  # file not found
+            error = _('Book path %(path)s not found on Google Drive', path=book.path)  # file not found
 
     return error is None, error
 
@@ -611,7 +614,7 @@ def reset_password(user_id):
     if not config.get_mail_server_configured():
         return 2, None
     try:
-        password = generate_random_password()
+        password = generate_random_password(config.config_password_min_length)
         existing_user.password = generate_password_hash(password)
         ub.session.commit()
         send_registration_mail(existing_user.email, existing_user.name, password, True)
|
||||||
ub.session.rollback()
|
ub.session.rollback()
|
||||||
return 0, None
|
return 0, None
|
||||||
|
|
||||||
|
def generate_random_password(min_length):
|
||||||
|
min_length = max(8, min_length) - 4
|
||||||
|
random_source = "abcdefghijklmnopqrstuvwxyz01234567890ABCDEFGHIJKLMNOPQRSTUVWXYZ!@#$%&*()?"
|
||||||
|
# select 1 lowercase
|
||||||
|
s = "abcdefghijklmnopqrstuvwxyz"
|
||||||
|
password = [s[c % len(s)] for c in os.urandom(1)]
|
||||||
|
# select 1 uppercase
|
||||||
|
s = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
|
||||||
|
password.extend([s[c % len(s)] for c in os.urandom(1)])
|
||||||
|
# select 1 digit
|
||||||
|
s = "01234567890"
|
||||||
|
password.extend([s[c % len(s)] for c in os.urandom(1)])
|
||||||
|
# select 1 special symbol
|
||||||
|
s = "!@#$%&*()?"
|
||||||
|
password.extend([s[c % len(s)] for c in os.urandom(1)])
|
||||||
|
|
||||||
def generate_random_password():
|
# generate other characters
|
||||||
|
password.extend([random_source[c % len(random_source)] for c in os.urandom(min_length)])
|
||||||
|
|
||||||
|
# password_list = list(password)
|
||||||
|
# shuffle all characters
|
||||||
|
random.SystemRandom().shuffle(password)
|
||||||
|
return ''.join(password)
|
||||||
|
|
||||||
|
|
||||||
|
'''def generate_random_password(min_length):
|
||||||
s = "abcdefghijklmnopqrstuvwxyz01234567890ABCDEFGHIJKLMNOPQRSTUVWXYZ!@#$%&*()?"
|
s = "abcdefghijklmnopqrstuvwxyz01234567890ABCDEFGHIJKLMNOPQRSTUVWXYZ!@#$%&*()?"
|
||||||
passlen = 8
|
passlen = min_length
|
||||||
return "".join(s[c % len(s)] for c in os.urandom(passlen))
|
return "".join(s[c % len(s)] for c in os.urandom(passlen))'''
|
||||||
|
|
||||||
|
|
||||||
def uniq(inpt):
|
def uniq(inpt):
|
||||||
|
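The new `generate_random_password` above guarantees at least one character per class by drawing one byte from `os.urandom` for each class and reducing it modulo the alphabet size, then filling the rest from the full alphabet and shuffling with `random.SystemRandom`. A self-contained sketch of that scheme (names mirror the diff; `min_length` reserves four slots for the mandatory classes):

```python
import os
import random

def generate_random_password(min_length):
    # reserve 4 slots for the mandatory character classes
    min_length = max(8, min_length) - 4
    random_source = "abcdefghijklmnopqrstuvwxyz0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ!@#$%&*()?"
    password = []
    # one guaranteed character from each class, picked from a urandom byte
    for s in ("abcdefghijklmnopqrstuvwxyz",
              "ABCDEFGHIJKLMNOPQRSTUVWXYZ",
              "0123456789",
              "!@#$%&*()?"):
        password.extend(s[c % len(s)] for c in os.urandom(1))
    # fill the remaining slots from the full alphabet
    password.extend(random_source[c % len(random_source)] for c in os.urandom(min_length))
    # shuffle so the class-guaranteed characters are not always first
    random.SystemRandom().shuffle(password)
    return ''.join(password)
```

Note that `c % len(s)` introduces a small modulo bias when 256 is not a multiple of the alphabet size; `secrets.choice` would avoid it, but the sketch follows the diff's approach.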
@@ -639,28 +666,49 @@ def uniq(inpt):
 def check_email(email):
     email = valid_email(email)
     if ub.session.query(ub.User).filter(func.lower(ub.User.email) == email.lower()).first():
-        log.error(u"Found an existing account for this e-mail address")
-        raise Exception(_(u"Found an existing account for this e-mail address"))
+        log.error("Found an existing account for this Email address")
+        raise Exception(_("Found an existing account for this Email address"))
     return email


 def check_username(username):
     username = username.strip()
     if ub.session.query(ub.User).filter(func.lower(ub.User.name) == username.lower()).scalar():
-        log.error(u"This username is already taken")
-        raise Exception(_(u"This username is already taken"))
+        log.error("This username is already taken")
+        raise Exception(_("This username is already taken"))
     return username


 def valid_email(email):
     email = email.strip()
-    # Regex according to https://developer.mozilla.org/en-US/docs/Web/HTML/Element/input/email#validation
-    if not re.search(r"^[\w.!#$%&'*+\\/=?^_`{|}~-]+@[\w](?:[\w-]{0,61}[\w])?(?:\.[\w](?:[\w-]{0,61}[\w])?)*$",
-                     email):
-        log.error(u"Invalid e-mail address format")
-        raise Exception(_(u"Invalid e-mail address format"))
+    # if email is not deleted
+    if email:
+        # Regex according to https://developer.mozilla.org/en-US/docs/Web/HTML/Element/input/email#validation
+        if not re.search(r"^[\w.!#$%&'*+\\/=?^_`{|}~-]+@[\w](?:[\w-]{0,61}[\w])?(?:\.[\w](?:[\w-]{0,61}[\w])?)*$",
+                         email):
+            log.error("Invalid Email address format")
+            raise Exception(_("Invalid Email address format"))
     return email


+def valid_password(check_password):
+    if config.config_password_policy:
+        verify = ""
+        if config.config_password_min_length > 0:
+            verify += r"^(?=.{" + str(config.config_password_min_length) + ",}$)"
+        if config.config_password_number:
+            verify += r"(?=.*?\d)"
+        if config.config_password_lower:
+            verify += r"(?=.*?[\p{Ll}])"
+        if config.config_password_upper:
+            verify += r"(?=.*?[\p{Lu}])"
+        if config.config_password_character:
+            verify += r"(?=.*?[\p{Letter}])"
+        if config.config_password_special:
+            verify += r"(?=.*?[^\p{Letter}\s0-9])"
+        match = regex.match(verify, check_password)
+        if not match:
+            raise Exception(_("Password doesn't comply with password validation rules"))
+    return check_password
+
 # ################################# External interface #################################
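The new `valid_password` builds the policy as a chain of lookaheads anchored at the start of the string, so every enabled rule is checked independently against the whole password. The `\p{Ll}`/`\p{Lu}`/`\p{Letter}` classes require the third-party `regex` module (stdlib `re` has no Unicode property classes), which is why the diff calls `regex.match`. A minimal ASCII-only illustration of the same lookahead technique using only stdlib `re` (function name and parameters are hypothetical, not from the diff):

```python
import re

def build_policy(min_length=8, need_number=True, need_lower=True,
                 need_upper=True, need_special=True):
    # each requirement becomes an independent lookahead anchored at ^
    verify = r"^(?=.{%d,}$)" % min_length
    if need_number:
        verify += r"(?=.*?\d)"
    if need_lower:
        verify += r"(?=.*?[a-z])"
    if need_upper:
        verify += r"(?=.*?[A-Z])"
    if need_special:
        verify += r"(?=.*?[^A-Za-z\s0-9])"
    return re.compile(verify)

policy = build_policy()
```

Because each lookahead restarts the scan from the anchor, the rules compose without caring about character order.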
@@ -683,35 +731,35 @@ def update_dir_structure(book_id,
 def delete_book(book, calibrepath, book_format):
     if not book_format:
         clear_cover_thumbnail_cache(book.id)  ## here it breaks
+        calibre_db.delete_dirty_metadata(book.id)
     if config.config_use_google_drive:
         return delete_book_gdrive(book, book_format)
     else:
         return delete_book_file(book, calibrepath, book_format)


-def get_cover_on_failure(use_generic_cover):
-    if use_generic_cover:
-        try:
-            return send_from_directory(_STATIC_DIR, "generic_cover.jpg")
-        except PermissionError:
-            log.error("No permission to access generic_cover.jpg file.")
-            abort(403)
-    else:
-        abort(404)
+def get_cover_on_failure():
+    try:
+        return send_from_directory(_STATIC_DIR, "generic_cover.jpg")
+    except PermissionError:
+        log.error("No permission to access generic_cover.jpg file.")
+        abort(403)


 def get_book_cover(book_id, resolution=None):
     book = calibre_db.get_filtered_book(book_id, allow_show_archived=True)
-    return get_book_cover_internal(book, use_generic_cover_on_failure=True, resolution=resolution)
+    return get_book_cover_internal(book, resolution=resolution)


-# Called only by kobo sync -> cover not found should be answered with 404 and not with default cover
 def get_book_cover_with_uuid(book_uuid, resolution=None):
     book = calibre_db.get_book_by_uuid(book_uuid)
-    return get_book_cover_internal(book, use_generic_cover_on_failure=False, resolution=resolution)
+    if not book:
+        return  # allows kobo.HandleCoverImageRequest to proxy request
+    return get_book_cover_internal(book, resolution=resolution)


-def get_book_cover_internal(book, use_generic_cover_on_failure, resolution=None):
+def get_book_cover_internal(book, resolution=None):
     if book and book.has_cover:

         # Send the book cover thumbnail if it exists in cache
@@ -727,26 +775,26 @@ def get_book_cover_internal(book, resolution=None):
         if config.config_use_google_drive:
             try:
                 if not gd.is_gdrive_ready():
-                    return get_cover_on_failure(use_generic_cover_on_failure)
+                    return get_cover_on_failure()
                 path = gd.get_cover_via_gdrive(book.path)
                 if path:
                     return redirect(path)
                 else:
                     log.error('{}/cover.jpg not found on Google Drive'.format(book.path))
-                    return get_cover_on_failure(use_generic_cover_on_failure)
+                    return get_cover_on_failure()
             except Exception as ex:
                 log.error_or_exception(ex)
-                return get_cover_on_failure(use_generic_cover_on_failure)
+                return get_cover_on_failure()

         # Send the book cover from the Calibre directory
         else:
-            cover_file_path = os.path.join(config.config_calibre_dir, book.path)
+            cover_file_path = os.path.join(config.get_book_path(), book.path)
             if os.path.isfile(os.path.join(cover_file_path, "cover.jpg")):
                 return send_from_directory(cover_file_path, "cover.jpg")
             else:
-                return get_cover_on_failure(use_generic_cover_on_failure)
+                return get_cover_on_failure()
     else:
-        return get_cover_on_failure(use_generic_cover_on_failure)
+        return get_cover_on_failure()


 def get_book_cover_thumbnail(book, resolution):
@@ -769,7 +817,7 @@ def get_series_thumbnail_on_failure(series_id, resolution):
         .filter(db.Books.has_cover == 1) \
         .first()

-    return get_book_cover_internal(book, use_generic_cover_on_failure=True, resolution=resolution)
+    return get_book_cover_internal(book, resolution=resolution)


 def get_series_cover_thumbnail(series_id, resolution=None):
@@ -833,8 +881,8 @@ def save_cover_from_filestorage(filepath, saved_filename, img):
         try:
             os.makedirs(filepath)
         except OSError:
-            log.error(u"Failed to create path for cover")
-            return False, _(u"Failed to create path for cover")
+            log.error("Failed to create path for cover")
+            return False, _("Failed to create path for cover")
     try:
         # upload of jgp file without wand
         if isinstance(img, requests.Response):
@@ -849,8 +897,8 @@ def save_cover_from_filestorage(filepath, saved_filename, img):
             # upload of jpg/png... from hdd
             img.save(os.path.join(filepath, saved_filename))
     except (IOError, OSError):
-        log.error(u"Cover-file is not a valid image file, or could not be stored")
-        return False, _(u"Cover-file is not a valid image file, or could not be stored")
+        log.error("Cover-file is not a valid image file, or could not be stored")
+        return False, _("Cover-file is not a valid image file, or could not be stored")
     return True, None
@@ -880,10 +928,7 @@ def save_cover(img, book_path):
         return False, _("Only jpg/jpeg files are supported as coverfile")

     if config.config_use_google_drive:
-        tmp_dir = os.path.join(gettempdir(), 'calibre_web')
-
-        if not os.path.isdir(tmp_dir):
-            os.mkdir(tmp_dir)
+        tmp_dir = get_temp_dir()
         ret, message = save_cover_from_filestorage(tmp_dir, "uploaded_cover.jpg", img)
         if ret is True:
             gd.uploadFileToEbooksFolder(os.path.join(book_path, 'cover.jpg').replace("\\", "/"),
|
@ -893,33 +938,72 @@ def save_cover(img, book_path):
|
||||||
else:
|
else:
|
||||||
return False, message
|
return False, message
|
||||||
else:
|
else:
|
||||||
return save_cover_from_filestorage(os.path.join(config.config_calibre_dir, book_path), "cover.jpg", img)
|
return save_cover_from_filestorage(os.path.join(config.get_book_path(), book_path), "cover.jpg", img)
|
||||||
|
|
||||||
|
|
||||||
def do_download_file(book, book_format, client, data, headers):
|
def do_download_file(book, book_format, client, data, headers):
|
||||||
|
book_name = data.name
|
||||||
if config.config_use_google_drive:
|
if config.config_use_google_drive:
|
||||||
# startTime = time.time()
|
# startTime = time.time()
|
||||||
df = gd.getFileFromEbooksFolder(book.path, data.name + "." + book_format)
|
df = gd.getFileFromEbooksFolder(book.path, book_name + "." + book_format)
|
||||||
# log.debug('%s', time.time() - startTime)
|
# log.debug('%s', time.time() - startTime)
|
||||||
if df:
|
if df:
|
||||||
return gd.do_gdrive_download(df, headers)
|
if config.config_embed_metadata and (
|
||||||
|
(book_format == "kepub" and config.config_kepubifypath ) or
|
||||||
|
(book_format != "kepub" and config.config_binariesdir)):
|
||||||
|
output_path = os.path.join(config.config_calibre_dir, book.path)
|
||||||
|
if not os.path.exists(output_path):
|
||||||
|
os.makedirs(output_path)
|
||||||
|
output = os.path.join(config.config_calibre_dir, book.path, book_name + "." + book_format)
|
||||||
|
gd.downloadFile(book.path, book_name + "." + book_format, output)
|
||||||
|
if book_format == "kepub" and config.config_kepubifypath:
|
||||||
|
filename, download_name = do_kepubify_metadata_replace(book, output)
|
||||||
|
elif book_format != "kepub" and config.config_binariesdir:
|
||||||
|
filename, download_name = do_calibre_export(book.id, book_format)
|
||||||
|
else:
|
||||||
|
return gd.do_gdrive_download(df, headers)
|
||||||
else:
|
else:
|
||||||
abort(404)
|
abort(404)
|
||||||
else:
|
else:
|
||||||
filename = os.path.join(config.config_calibre_dir, book.path)
|
filename = os.path.join(config.get_book_path(), book.path)
|
||||||
if not os.path.isfile(os.path.join(filename, data.name + "." + book_format)):
|
if not os.path.isfile(os.path.join(filename, book_name + "." + book_format)):
|
||||||
# ToDo: improve error handling
|
# ToDo: improve error handling
|
||||||
log.error('File not found: %s', os.path.join(filename, data.name + "." + book_format))
|
log.error('File not found: %s', os.path.join(filename, book_name + "." + book_format))
|
||||||
|
|
||||||
if client == "kobo" and book_format == "kepub":
|
if client == "kobo" and book_format == "kepub":
|
||||||
headers["Content-Disposition"] = headers["Content-Disposition"].replace(".kepub", ".kepub.epub")
|
headers["Content-Disposition"] = headers["Content-Disposition"].replace(".kepub", ".kepub.epub")
|
||||||
|
|
||||||
response = make_response(send_from_directory(filename, data.name + "." + book_format))
|
if book_format == "kepub" and config.config_kepubifypath and config.config_embed_metadata:
|
||||||
# ToDo Check headers parameter
|
filename, download_name = do_kepubify_metadata_replace(book, os.path.join(filename,
|
||||||
for element in headers:
|
book_name + "." + book_format))
|
||||||
response.headers[element[0]] = element[1]
|
elif book_format != "kepub" and config.config_binariesdir and config.config_embed_metadata:
|
||||||
log.info('Downloading file: {}'.format(os.path.join(filename, data.name + "." + book_format)))
|
filename, download_name = do_calibre_export(book.id, book_format)
|
||||||
return response
|
else:
|
||||||
|
download_name = book_name
|
||||||
|
|
||||||
|
response = make_response(send_from_directory(filename, download_name + "." + book_format))
|
||||||
|
# ToDo Check headers parameter
|
||||||
|
for element in headers:
|
||||||
|
response.headers[element[0]] = element[1]
|
||||||
|
log.info('Downloading file: {}'.format(os.path.join(filename, book_name + "." + book_format)))
|
||||||
|
return response
|
||||||
|
|
||||||
|
|
||||||
|
def do_kepubify_metadata_replace(book, file_path):
|
||||||
|
custom_columns = (calibre_db.session.query(db.CustomColumns)
|
||||||
|
.filter(db.CustomColumns.mark_for_delete == 0)
|
||||||
|
.filter(db.CustomColumns.datatype.notin_(db.cc_exceptions))
|
||||||
|
.order_by(db.CustomColumns.label).all())
|
||||||
|
|
||||||
|
tree, cf_name = get_content_opf(file_path)
|
||||||
|
package = create_new_metadata_backup(book, custom_columns, current_user.locale, _("Cover"), lang_type=2)
|
||||||
|
content = replace_metadata(tree, package)
|
||||||
|
tmp_dir = get_temp_dir()
|
||||||
|
temp_file_name = str(uuid4())
|
||||||
|
# open zipfile and replace metadata block in content.opf
|
||||||
|
updateEpub(file_path, os.path.join(tmp_dir, temp_file_name + ".kepub"), cf_name, content)
|
||||||
|
return tmp_dir, temp_file_name
|
||||||
|
|
||||||
|
|
||||||
##################################
|
##################################
|
||||||
|
|
||||||
|
@@ -943,6 +1027,47 @@ def check_unrar(unrar_location):
         return _('Error executing UnRar')


+def check_calibre(calibre_location):
+    if not calibre_location:
+        return
+
+    if not os.path.exists(calibre_location):
+        return _('Could not find the specified directory')
+
+    if not os.path.isdir(calibre_location):
+        return _('Please specify a directory, not a file')
+
+    try:
+        supported_binary_paths = [os.path.join(calibre_location, binary)
+                                  for binary in SUPPORTED_CALIBRE_BINARIES.values()]
+        binaries_available = [os.path.isfile(binary_path) for binary_path in supported_binary_paths]
+        binaries_executable = [os.access(binary_path, os.X_OK) for binary_path in supported_binary_paths]
+        if all(binaries_available) and all(binaries_executable):
+            values = [process_wait([binary_path, "--version"], pattern=r'\(calibre (.*)\)')
+                      for binary_path in supported_binary_paths]
+            if all(values):
+                version = values[0].group(1)
+                log.debug("calibre version %s", version)
+            else:
+                return _('Calibre binaries not viable')
+        else:
+            ret_val = []
+            missing_binaries = [path for path, available in
+                                zip(SUPPORTED_CALIBRE_BINARIES.values(), binaries_available) if not available]
+
+            missing_perms = [path for path, available in
+                             zip(SUPPORTED_CALIBRE_BINARIES.values(), binaries_executable) if not available]
+            if missing_binaries:
+                ret_val.append(_('Missing calibre binaries: %(missing)s', missing=", ".join(missing_binaries)))
+            if missing_perms:
+                ret_val.append(_('Missing executable permissions: %(missing)s', missing=", ".join(missing_perms)))
+            return ", ".join(ret_val)
+
+    except (OSError, UnicodeDecodeError) as err:
+        log.error_or_exception(err)
+        return _('Error excecuting Calibre')
+
+
 def json_serial(obj):
     """JSON serializer for objects not serializable by default json code"""
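The `check_calibre` addition probes each binary with `--version` and extracts the version via the pattern `\(calibre (.*)\)`. A small standalone sketch of that extraction step (the sample output line is hypothetical, shaped like what the calibre CLI tools print):

```python
import re

# calibre CLI tools print a line containing something like "(calibre 7.5.1)";
# check_calibre pulls the version number out with this pattern
VERSION_PATTERN = re.compile(r'\(calibre (.*)\)')

def parse_calibre_version(output):
    match = VERSION_PATTERN.search(output)
    return match.group(1) if match else None
```

Returning `None` on no match mirrors how `check_calibre` treats a falsy `process_wait` result as "Calibre binaries not viable".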
@@ -967,43 +1092,38 @@ def tags_filters():


 # checks if domain is in database (including wildcards)
 # example SELECT * FROM @TABLE WHERE 'abcdefg' LIKE Name;
 # from https://code.luasoftware.com/tutorials/flask/execute-raw-sql-in-flask-sqlalchemy/
 # in all calls the email address is checked for validity
 def check_valid_domain(domain_text):
     sql = "SELECT * FROM registration WHERE (:domain LIKE domain and allow = 1);"
-    result = ub.session.query(ub.Registration).from_statement(text(sql)).params(domain=domain_text).all()
-    if not len(result):
+    if not len(ub.session.query(ub.Registration).from_statement(text(sql)).params(domain=domain_text).all()):
         return False
     sql = "SELECT * FROM registration WHERE (:domain LIKE domain and allow = 0);"
-    result = ub.session.query(ub.Registration).from_statement(text(sql)).params(domain=domain_text).all()
-    return not len(result)
+    return not len(ub.session.query(ub.Registration).from_statement(text(sql)).params(domain=domain_text).all())


 def get_download_link(book_id, book_format, client):
     book_format = book_format.split(".")[0]
     book = calibre_db.get_filtered_book(book_id, allow_show_archived=True)
-    data1= ""
     if book:
         data1 = calibre_db.get_book_format(book.id, book_format.upper())
+        if data1:
+            # collect downloaded books only for registered user and not for anonymous user
+            if current_user.is_authenticated:
+                ub.update_download(book_id, int(current_user.id))
+            file_name = book.title
+            if len(book.authors) > 0:
+                file_name = file_name + ' - ' + book.authors[0].name
+            file_name = get_valid_filename(file_name, replace_whitespace=False)
+            headers = Headers()
+            headers["Content-Type"] = mimetypes.types_map.get('.' + book_format, "application/octet-stream")
+            headers["Content-Disposition"] = "attachment; filename=%s.%s; filename*=UTF-8''%s.%s" % (
+                quote(file_name), book_format, quote(file_name), book_format)
+            return do_download_file(book, book_format, client, data1, headers)
     else:
         log.error("Book id {} not found for downloading".format(book_id))
     abort(404)
-    if data1:
-        # collect downloaded books only for registered user and not for anonymous user
-        if current_user.is_authenticated:
-            ub.update_download(book_id, int(current_user.id))
-        file_name = book.title
-        if len(book.authors) > 0:
-            file_name = file_name + ' - ' + book.authors[0].name
-        file_name = get_valid_filename(file_name, replace_whitespace=False)
-        headers = Headers()
-        headers["Content-Type"] = mimetypes.types_map.get('.' + book_format, "application/octet-stream")
-        headers["Content-Disposition"] = "attachment; filename=%s.%s; filename*=UTF-8''%s.%s" % (
-            quote(file_name.encode('utf-8')), book_format, quote(file_name.encode('utf-8')), book_format)
-        return do_download_file(book, book_format, client, data1, headers)
-    else:
-        abort(404)


 def clear_cover_thumbnail_cache(book_id):
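The `check_valid_domain` logic above inverts the usual LIKE direction: the *stored* domain is the pattern (so `%` wildcards live in the database) and the candidate address is the literal, i.e. `:domain LIKE domain`. An explicit-deny row always wins over a wildcard allow. A self-contained illustration with stdlib `sqlite3` (the in-memory table stands in for `ub.Registration`):

```python
import sqlite3

# '%' wildcards in the stored domain act as the pattern side of LIKE
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE registration (domain TEXT, allow INTEGER)")
con.executemany("INSERT INTO registration VALUES (?, ?)",
                [("%.example.com", 1), ("spam.example.com", 0)])

def check_valid_domain(domain_text):
    allowed = con.execute(
        "SELECT 1 FROM registration WHERE (? LIKE domain AND allow = 1)",
        (domain_text,)).fetchall()
    if not allowed:
        return False
    denied = con.execute(
        "SELECT 1 FROM registration WHERE (? LIKE domain AND allow = 0)",
        (domain_text,)).fetchall()
    return not denied
```

With these rows, `mail.example.com` passes the wildcard allow, `spam.example.com` is rejected by the explicit deny, and a domain with no allow rule is rejected outright.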
@@ -1029,3 +1149,11 @@ def add_book_to_thumbnail_cache(book_id):
 def update_thumbnail_cache():
     if config.schedule_generate_book_covers:
         WorkerThread.add(None, TaskGenerateCoverThumbnails())


+def set_all_metadata_dirty():
+    WorkerThread.add(None, TaskBackupMetadata(export_language=get_locale(),
+                                              translated_title=_("Cover"),
+                                              set_dirty=True,
+                                              task_message=N_("Queue all books for metadata backup")),
+                     hidden=False)
@@ -124,7 +124,7 @@ def formatseriesindex_filter(series_index):
             return int(series_index)
         else:
             return series_index
-    except ValueError:
+    except (ValueError, TypeError):
         return series_index
     return 0
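The widened `except` matters because `int()` raises `ValueError` for an unparsable string but `TypeError` for `None`, so a book with no series index would previously crash the filter. A minimal sketch of the fallback behavior (simplified function, not the full filter from the diff):

```python
def safe_series_index(series_index):
    # return the raw value whenever it cannot be interpreted as a whole number
    try:
        if int(series_index) == series_index:
            return int(series_index)
        return series_index
    except (ValueError, TypeError):
        return series_index
```

`int("abc")` takes the `ValueError` path; `int(None)` takes the `TypeError` path that the old filter did not catch.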
cps/kobo.py (182 changed lines)
@@ -21,6 +21,7 @@ import base64
 import datetime
 import os
 import uuid
+import zipfile
 from time import gmtime, strftime
 import json
 from urllib.parse import unquote
@@ -45,7 +46,9 @@ import requests


 from . import config, logger, kobo_auth, db, calibre_db, helper, shelf as shelf_lib, ub, csrf, kobo_sync_status
-from .constants import sqlalchemy_version2, COVER_THUMBNAIL_SMALL
+from . import isoLanguages
+from .epub import get_epub_layout
+from .constants import COVER_THUMBNAIL_SMALL  # , sqlalchemy_version2
 from .helper import get_download_link
 from .services import SyncToken as SyncToken
 from .web import download_required
@@ -53,7 +56,7 @@ from .kobo_auth import requires_kobo_auth, get_auth_token

 KOBO_FORMATS = {"KEPUB": ["KEPUB"], "EPUB": ["EPUB3", "EPUB"]}
 KOBO_STOREAPI_URL = "https://storeapi.kobo.com"
-KOBO_IMAGEHOST_URL = "https://kbimages1-a.akamaihd.net"
+KOBO_IMAGEHOST_URL = "https://cdn.kobo.com/book-images"

 SYNC_ITEM_LIMIT = 100
@@ -134,11 +137,15 @@ def convert_to_kobo_timestamp_string(timestamp):

 @kobo.route("/v1/library/sync")
 @requires_kobo_auth
-@download_required
+# @download_required
 def HandleSyncRequest():
+    if not current_user.role_download():
+        log.info("Users need download permissions for syncing library to Kobo reader")
+        return abort(403)
     sync_token = SyncToken.SyncToken.from_headers(request.headers)
-    log.info("Kobo library sync request received.")
+    log.info("Kobo library sync request received")
     log.debug("SyncToken: {}".format(sync_token))
+    log.debug("Download link format {}".format(get_download_url_for_book('[bookid]', '[bookformat]')))
     if not current_app.wsgi_app.is_proxied:
         log.debug('Kobo: Received unproxied request, changed request port to external server port')
@@ -162,16 +169,10 @@ def HandleSyncRequest():
     only_kobo_shelves = current_user.kobo_only_shelves_sync

     if only_kobo_shelves:
-        if sqlalchemy_version2:
-            changed_entries = select(db.Books,
-                                     ub.ArchivedBook.last_modified,
-                                     ub.BookShelf.date_added,
-                                     ub.ArchivedBook.is_archived)
-        else:
-            changed_entries = calibre_db.session.query(db.Books,
-                                                       ub.ArchivedBook.last_modified,
-                                                       ub.BookShelf.date_added,
-                                                       ub.ArchivedBook.is_archived)
+        changed_entries = calibre_db.session.query(db.Books,
+                                                   ub.ArchivedBook.last_modified,
+                                                   ub.BookShelf.date_added,
+                                                   ub.ArchivedBook.is_archived)
         changed_entries = (changed_entries
                            .join(db.Data).outerjoin(ub.ArchivedBook, and_(db.Books.id == ub.ArchivedBook.book_id,
                                                                           ub.ArchivedBook.user_id == current_user.id))
@@ -188,12 +189,9 @@ def HandleSyncRequest():
                            .filter(ub.Shelf.kobo_sync)
                            .distinct())
     else:
-        if sqlalchemy_version2:
-            changed_entries = select(db.Books, ub.ArchivedBook.last_modified, ub.ArchivedBook.is_archived)
-        else:
-            changed_entries = calibre_db.session.query(db.Books,
-                                                       ub.ArchivedBook.last_modified,
-                                                       ub.ArchivedBook.is_archived)
+        changed_entries = calibre_db.session.query(db.Books,
+                                                   ub.ArchivedBook.last_modified,
+                                                   ub.ArchivedBook.is_archived)
         changed_entries = (changed_entries
                            .join(db.Data).outerjoin(ub.ArchivedBook, and_(db.Books.id == ub.ArchivedBook.book_id,
                                                                           ub.ArchivedBook.user_id == current_user.id))
@@ -205,15 +203,12 @@ def HandleSyncRequest():
                            .order_by(db.Books.id))

     reading_states_in_new_entitlements = []
-    if sqlalchemy_version2:
-        books = calibre_db.session.execute(changed_entries.limit(SYNC_ITEM_LIMIT))
-    else:
-        books = changed_entries.limit(SYNC_ITEM_LIMIT)
+    books = changed_entries.limit(SYNC_ITEM_LIMIT)
     log.debug("Books to Sync: {}".format(len(books.all())))
     for book in books:
         formats = [data.format for data in book.Books.data]
         if 'KEPUB' not in formats and config.config_kepubifypath and 'EPUB' in formats:
-            helper.convert_book_format(book.Books.id, config.config_calibre_dir, 'EPUB', 'KEPUB', current_user.name)
+            helper.convert_book_format(book.Books.id, config.get_book_path(), 'EPUB', 'KEPUB', current_user.name)

         kobo_reading_state = get_or_create_reading_state(book.Books.id)
         entitlement = {
@@ -226,7 +221,7 @@ def HandleSyncRequest():
             new_reading_state_last_modified = max(new_reading_state_last_modified, kobo_reading_state.last_modified)
             reading_states_in_new_entitlements.append(book.Books.id)

-        ts_created = book.Books.timestamp
+        ts_created = book.Books.timestamp.replace(tzinfo=None)

         try:
             ts_created = max(ts_created, book.date_added)
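The `.replace(tzinfo=None)` changes in this hunk (and the one below for `last_modified`) matter because Python refuses to compare offset-aware and offset-naive datetimes, so a `max()` over a mixed pair raises `TypeError`. A minimal sketch of the failure mode and the fix the diff applies (the sample values are illustrative):

```python
from datetime import datetime, timezone

aware = datetime(2023, 5, 1, tzinfo=timezone.utc)  # e.g. a timestamp loaded as UTC-aware
naive = datetime(2023, 6, 1)                       # e.g. a naive value from the sync token

try:
    latest = max(aware, naive)  # TypeError: can't compare offset-naive and offset-aware datetimes
except TypeError:
    latest = max(aware.replace(tzinfo=None), naive)  # strip tzinfo first, as the diff does

print(latest)  # 2023-06-01 00:00:00
```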
@@ -239,7 +234,7 @@ def HandleSyncRequest():
             sync_results.append({"ChangedEntitlement": entitlement})

             new_books_last_modified = max(
-                book.Books.last_modified, new_books_last_modified
+                book.Books.last_modified.replace(tzinfo=None), new_books_last_modified
             )
             try:
                 new_books_last_modified = max(
@@ -251,27 +246,16 @@ def HandleSyncRequest():
         new_books_last_created = max(ts_created, new_books_last_created)
         kobo_sync_status.add_synced_books(book.Books.id)

-    if sqlalchemy_version2:
-        max_change = calibre_db.session.execute(changed_entries
-                                                .filter(ub.ArchivedBook.is_archived)
-                                                .filter(ub.ArchivedBook.user_id == current_user.id)
-                                                .order_by(func.datetime(ub.ArchivedBook.last_modified).desc()))\
-            .columns(db.Books).first()
-    else:
-        max_change = changed_entries.from_self().filter(ub.ArchivedBook.is_archived)\
-            .filter(ub.ArchivedBook.user_id == current_user.id) \
-            .order_by(func.datetime(ub.ArchivedBook.last_modified).desc()).first()
+    max_change = changed_entries.filter(ub.ArchivedBook.is_archived)\
+        .filter(ub.ArchivedBook.user_id == current_user.id) \
+        .order_by(func.datetime(ub.ArchivedBook.last_modified).desc()).first()

     max_change = max_change.last_modified if max_change else new_archived_last_modified

     new_archived_last_modified = max(new_archived_last_modified, max_change)

     # no. of books returned
-    if sqlalchemy_version2:
-        entries = calibre_db.session.execute(changed_entries).all()
-        book_count = len(entries)
-    else:
-        book_count = changed_entries.count()
+    book_count = changed_entries.count()
     # last entry:
     cont_sync = bool(book_count)
     log.debug("Remaining books to Sync: {}".format(book_count))
@@ -334,7 +318,7 @@ def generate_sync_response(sync_token, sync_results, set_cont=False):
             extra_headers["x-kobo-recent-reads"] = store_response.headers.get("x-kobo-recent-reads")

         except Exception as ex:
-            log.error("Failed to receive or parse response from Kobo's sync endpoint: {}".format(ex))
+            log.error_or_exception("Failed to receive or parse response from Kobo's sync endpoint: {}".format(ex))
     if set_cont:
         extra_headers["x-kobo-sync"] = "continue"
     sync_token.to_headers(extra_headers)
@@ -355,7 +339,7 @@ def HandleMetadataRequest(book_uuid):
     log.info("Kobo library metadata request received for book %s" % book_uuid)
     book = calibre_db.get_book_by_uuid(book_uuid)
     if not book or not book.data:
-        log.info(u"Book %s not found in database", book_uuid)
+        log.info("Book %s not found in database", book_uuid)
         return redirect_or_proxy_request()

     metadata = get_metadata(book)
@@ -364,7 +348,7 @@ def HandleMetadataRequest(book_uuid):
     return response


-def get_download_url_for_book(book, book_format):
+def get_download_url_for_book(book_id, book_format):
     if not current_app.wsgi_app.is_proxied:
         if ':' in request.host and not request.host.endswith(']'):
             host = "".join(request.host.split(':')[:-1])
@@ -376,13 +360,13 @@ def get_download_url_for_book(book, book_format):
             url_base=host,
             url_port=config.config_external_port,
             auth_token=get_auth_token(),
-            book_id=book.id,
+            book_id=book_id,
             book_format=book_format.lower()
         )
     return url_for(
         "kobo.download_book",
         auth_token=kobo_auth.get_auth_token(),
-        book_id=book.id,
+        book_id=book_id,
         book_format=book_format.lower(),
         _external=True,
     )
@@ -443,6 +427,12 @@ def get_seriesindex(book):
     return book.series_index or 1


+def get_language(book):
+    if not book.languages:
+        return 'en'
+    return isoLanguages.get(part3=book.languages[0].lang_code).part1
+
+
 def get_metadata(book):
     download_urls = []
     kepub = [data for data in book.data if data.format == 'KEPUB']
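The new `get_language` helper converts the book's first stored language code (ISO 639-3, e.g. `deu`) into the two-letter ISO 639-1 code Kobo expects, defaulting to `en`. A dependency-free sketch of that lookup, assuming the part3-to-part1 mapping that `isoLanguages.get(part3=...).part1` performs internally (the table below is a tiny illustrative subset, not the real library data):

```python
# Illustrative subset of the ISO 639-3 -> ISO 639-1 mapping
ISO639_PART3_TO_PART1 = {"eng": "en", "deu": "de", "fra": "fr", "jpn": "ja"}

def get_language(lang_codes):
    """Return the two-letter code for the first language, 'en' if none is set."""
    if not lang_codes:
        return "en"
    return ISO639_PART3_TO_PART1.get(lang_codes[0], "en")

print(get_language(["deu"]))  # de
print(get_language([]))       # en
```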
@@ -452,16 +442,21 @@ def get_metadata(book):
             continue
         for kobo_format in KOBO_FORMATS[book_data.format]:
             # log.debug('Id: %s, Format: %s' % (book.id, kobo_format))
-            download_urls.append(
-                {
-                    "Format": kobo_format,
-                    "Size": book_data.uncompressed_size,
-                    "Url": get_download_url_for_book(book, book_data.format),
-                    # The Kobo forma accepts platforms: (Generic, Android)
-                    "Platform": "Generic",
-                    # "DrmType": "None", # Not required
-                }
-            )
+            try:
+                if get_epub_layout(book, book_data) == 'pre-paginated':
+                    kobo_format = 'EPUB3FL'
+                download_urls.append(
+                    {
+                        "Format": kobo_format,
+                        "Size": book_data.uncompressed_size,
+                        "Url": get_download_url_for_book(book.id, book_data.format),
+                        # The Kobo forma accepts platforms: (Generic, Android)
+                        "Platform": "Generic",
+                        # "DrmType": "None", # Not required
+                    }
+                )
+            except (zipfile.BadZipfile, FileNotFoundError) as e:
+                log.error(e)

     book_uuid = book.uuid
     metadata = {
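The new `get_epub_layout` call decides whether an EPUB should be advertised to the device as `EPUB3FL` (fixed layout). A rough, self-contained sketch of such a check, assuming the layout is declared via the standard `rendition:layout` property in the EPUB's OPF package document (the function name and the sample OPF are illustrative, not the project's actual helper):

```python
import xml.etree.ElementTree as ET

def epub_layout_from_opf(opf_xml: str) -> str:
    """Return 'pre-paginated' or 'reflowable' based on the rendition:layout meta."""
    root = ET.fromstring(opf_xml)
    for meta in root.iter("{http://www.idpf.org/2007/opf}meta"):
        if meta.get("property") == "rendition:layout":
            return meta.text.strip()
    return "reflowable"  # EPUB 3 default when the property is absent

sample = """<package xmlns="http://www.idpf.org/2007/opf" version="3.0">
  <metadata><meta property="rendition:layout">pre-paginated</meta></metadata>
</package>"""
print(epub_layout_from_opf(sample))  # pre-paginated
```

In the real code the OPF is read out of the zip container, which is why `zipfile.BadZipfile` and `FileNotFoundError` are caught around the whole block.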
@@ -480,7 +475,7 @@ def get_metadata(book):
         "IsInternetArchive": False,
         "IsPreOrder": False,
         "IsSocialEnabled": True,
-        "Language": "en",
+        "Language": get_language(book),
         "PhoneticPronunciations": {},
         "PublicationDate": convert_to_kobo_timestamp_string(book.pubdate),
         "Publisher": {"Imprint": "", "Name": get_publisher(book), },
@@ -508,7 +503,7 @@ def get_metadata(book):
 @requires_kobo_auth
 # Creates a Shelf with the given items, and returns the shelf's uuid.
 def HandleTagCreate():
-    # catch delete requests, otherwise the are handled in the book delete handler
+    # catch delete requests, otherwise they are handled in the book delete handler
     if request.method == "DELETE":
         abort(405)
     name, items = None, None
@@ -702,20 +697,12 @@ def sync_shelves(sync_token, sync_results, only_kobo_shelves=False):
         })
         extra_filters.append(ub.Shelf.kobo_sync)

-    if sqlalchemy_version2:
-        shelflist = ub.session.execute(select(ub.Shelf).outerjoin(ub.BookShelf).filter(
-            or_(func.datetime(ub.Shelf.last_modified) > sync_token.tags_last_modified,
-                func.datetime(ub.BookShelf.date_added) > sync_token.tags_last_modified),
-            ub.Shelf.user_id == current_user.id,
-            *extra_filters
-        ).distinct().order_by(func.datetime(ub.Shelf.last_modified).asc())).columns(ub.Shelf)
-    else:
-        shelflist = ub.session.query(ub.Shelf).outerjoin(ub.BookShelf).filter(
-            or_(func.datetime(ub.Shelf.last_modified) > sync_token.tags_last_modified,
-                func.datetime(ub.BookShelf.date_added) > sync_token.tags_last_modified),
-            ub.Shelf.user_id == current_user.id,
-            *extra_filters
-        ).distinct().order_by(func.datetime(ub.Shelf.last_modified).asc())
+    shelflist = ub.session.query(ub.Shelf).outerjoin(ub.BookShelf).filter(
+        or_(func.datetime(ub.Shelf.last_modified) > sync_token.tags_last_modified,
+            func.datetime(ub.BookShelf.date_added) > sync_token.tags_last_modified),
+        ub.Shelf.user_id == current_user.id,
+        *extra_filters
+    ).distinct().order_by(func.datetime(ub.Shelf.last_modified).asc())

     for shelf in shelflist:
         if not shelf_lib.check_shelf_view_permissions(shelf):
@@ -752,7 +739,7 @@ def create_kobo_tag(shelf):
     for book_shelf in shelf.books:
         book = calibre_db.get_book(book_shelf.book_id)
         if not book:
-            log.info(u"Book (id: %s) in BookShelf (id: %s) not found in book database", book_shelf.book_id, shelf.id)
+            log.info("Book (id: %s) in BookShelf (id: %s) not found in book database", book_shelf.book_id, shelf.id)
             continue
         tag["Items"].append(
             {
@@ -769,7 +756,7 @@ def create_kobo_tag(shelf):
 def HandleStateRequest(book_uuid):
     book = calibre_db.get_book_by_uuid(book_uuid)
     if not book or not book.data:
-        log.info(u"Book %s not found in database", book_uuid)
+        log.info("Book %s not found in database", book_uuid)
         return redirect_or_proxy_request()

     kobo_reading_state = get_or_create_reading_state(book.id)
|
||||||
@kobo.route("/<book_uuid>/<width>/<height>/<Quality>/<isGreyscale>/image.jpg")
|
@kobo.route("/<book_uuid>/<width>/<height>/<Quality>/<isGreyscale>/image.jpg")
|
||||||
@requires_kobo_auth
|
@requires_kobo_auth
|
||||||
def HandleCoverImageRequest(book_uuid, width, height, Quality, isGreyscale):
|
def HandleCoverImageRequest(book_uuid, width, height, Quality, isGreyscale):
|
||||||
book_cover = helper.get_book_cover_with_uuid(book_uuid, resolution=COVER_THUMBNAIL_SMALL)
|
try:
|
||||||
if not book_cover:
|
resolution = None if int(height) > 1000 else COVER_THUMBNAIL_SMALL
|
||||||
if config.config_kobo_proxy:
|
except ValueError:
|
||||||
log.debug("Cover for unknown book: %s proxied to kobo" % book_uuid)
|
log.error("Requested height %s of book %s is invalid" % (book_uuid, height))
|
||||||
return redirect(KOBO_IMAGEHOST_URL +
|
resolution = COVER_THUMBNAIL_SMALL
|
||||||
"/{book_uuid}/{width}/{height}/false/image.jpg".format(book_uuid=book_uuid,
|
book_cover = helper.get_book_cover_with_uuid(book_uuid, resolution=resolution)
|
||||||
width=width,
|
if book_cover:
|
||||||
height=height), 307)
|
log.debug("Serving local cover image of book %s" % book_uuid)
|
||||||
else:
|
return book_cover
|
||||||
log.debug("Cover for unknown book: %s requested" % book_uuid)
|
|
||||||
# additional proxy request make no sense, -> direct return
|
if not config.config_kobo_proxy:
|
||||||
return make_response(jsonify({}))
|
log.debug("Returning 404 for cover image of unknown book %s" % book_uuid)
|
||||||
log.debug("Cover request received for book %s" % book_uuid)
|
# additional proxy request make no sense, -> direct return
|
||||||
return book_cover
|
return abort(404)
|
||||||
|
|
||||||
|
log.debug("Redirecting request for cover image of unknown book %s to Kobo" % book_uuid)
|
||||||
|
return redirect(KOBO_IMAGEHOST_URL +
|
||||||
|
"/{book_uuid}/{width}/{height}/false/image.jpg".format(book_uuid=book_uuid,
|
||||||
|
width=width,
|
||||||
|
height=height), 307)
|
||||||
|
|
||||||
|
|
||||||
@kobo.route("")
|
@kobo.route("")
|
||||||
|
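The cover handler now picks a thumbnail resolution from the height the device requests, falling back to the small thumbnail when the URL segment isn't numeric. The selection logic in isolation (the constant's value here is a stand-in for the one defined in the project's constants module):

```python
COVER_THUMBNAIL_SMALL = 1  # stand-in value; the real constant lives elsewhere

def pick_resolution(height):
    """Full-size cover (None) for large requests, small thumbnail otherwise."""
    try:
        return None if int(height) > 1000 else COVER_THUMBNAIL_SMALL
    except ValueError:  # height comes straight from the URL, so it may not be numeric
        return COVER_THUMBNAIL_SMALL

print(pick_resolution("1200"))  # None
print(pick_resolution("abc"))   # 1
```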
@@ -944,7 +937,7 @@ def HandleBookDeletionRequest(book_uuid):
     log.info("Kobo book delete request received for book %s" % book_uuid)
     book = calibre_db.get_book_by_uuid(book_uuid)
     if not book:
-        log.info(u"Book %s not found in database", book_uuid)
+        log.info("Book %s not found in database", book_uuid)
         return redirect_or_proxy_request()

     book_id = book.id
@@ -958,7 +951,7 @@ def HandleBookDeletionRequest(book_uuid):
 @csrf.exempt
 @kobo.route("/v1/library/<dummy>", methods=["DELETE", "GET"])
 def HandleUnimplementedRequest(dummy=None):
-    log.debug("Unimplemented Library Request received: %s", request.base_url)
+    log.debug("Unimplemented Library Request received: %s (request is forwarded to kobo if configured)", request.base_url)
     return redirect_or_proxy_request()


@@ -969,8 +962,9 @@ def HandleUnimplementedRequest(dummy=None):
 @kobo.route("/v1/user/wishlist", methods=["GET", "POST"])
 @kobo.route("/v1/user/recommendations", methods=["GET", "POST"])
 @kobo.route("/v1/analytics/<dummy>", methods=["GET", "POST"])
+@kobo.route("/v1/assets", methods=["GET"])
 def HandleUserRequest(dummy=None):
-    log.debug("Unimplemented User Request received: %s", request.base_url)
+    log.debug("Unimplemented User Request received: %s (request is forwarded to kobo if configured)", request.base_url)
     return redirect_or_proxy_request()


@@ -1010,7 +1004,7 @@ def handle_getests():
 @kobo.route("/v1/affiliate", methods=["GET", "POST"])
 @kobo.route("/v1/deals", methods=["GET", "POST"])
 def HandleProductsRequest(dummy=None):
-    log.debug("Unimplemented Products Request received: %s", request.base_url)
+    log.debug("Unimplemented Products Request received: %s (request is forwarded to kobo if configured)", request.base_url)
     return redirect_or_proxy_request()


@@ -1027,7 +1021,7 @@ def make_calibre_web_auth_response():
                 "RefreshToken": RefreshToken,
                 "TokenType": "Bearer",
                 "TrackingId": str(uuid.uuid4()),
-                "UserKey": content['UserKey'],
+                "UserKey": content.get('UserKey', ""),
             }
         )
     )
cps/kobo_auth.py

@@ -64,11 +64,12 @@ from datetime import datetime
 from os import urandom
 from functools import wraps

-from flask import g, Blueprint, url_for, abort, request
+from flask import g, Blueprint, abort, request
 from flask_login import login_user, current_user, login_required
 from flask_babel import gettext as _
+from flask_limiter import RateLimitExceeded

-from . import logger, config, calibre_db, db, helper, ub, lm
+from . import logger, config, calibre_db, db, helper, ub, lm, limiter
 from .render_template import render_title_template

 log = logger.create()
@@ -112,7 +113,7 @@ def generate_auth_token(user_id):

     return render_title_template(
         "generate_kobo_auth_url.html",
-        title=_(u"Kobo Setup"),
+        title=_("Kobo Setup"),
         auth_token=auth_token.auth_token,
         warning = warning
     )
@@ -151,6 +152,13 @@ def requires_kobo_auth(f):
     def inner(*args, **kwargs):
         auth_token = get_auth_token()
         if auth_token is not None:
+            try:
+                limiter.check()
+            except RateLimitExceeded:
+                return abort(429)
+            except (ConnectionError, Exception) as e:
+                log.error("Connection error to limiter backend: %s", e)
+                return abort(429)
             user = (
                 ub.session.query(ub.User)
                 .join(ub.RemoteAuthToken)
@@ -159,7 +167,8 @@ def requires_kobo_auth(f):
             )
             if user is not None:
                 login_user(user)
+                [limiter.limiter.storage.clear(k.key) for k in limiter.current_limits]
                 return f(*args, **kwargs)
         log.debug("Received Kobo request without a recognizable auth token.")
         return abort(401)
     return inner
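The pattern this hunk wires into the Kobo auth decorator is: check the rate limit before the token lookup, answer 429 when it is exceeded, and clear the counter after a successful login so legitimate devices are not throttled by their own retries. A dependency-free sketch of that pattern (the `RateLimitExceeded` name mirrors flask-limiter's, but the limiter below is a toy fixed-window counter, not the real extension):

```python
import time

class RateLimitExceeded(Exception):
    pass

class FixedWindowLimiter:
    """Toy stand-in for flask-limiter: at most `limit` hits per window, per key."""
    def __init__(self, limit=3, window=60.0):
        self.limit, self.window, self.hits = limit, window, {}

    def check(self, key):
        now = time.monotonic()
        start, count = self.hits.get(key, (now, 0))
        if now - start > self.window:   # window expired: reset the counter
            start, count = now, 0
        if count >= self.limit:
            raise RateLimitExceeded(key)
        self.hits[key] = (start, count + 1)

    def clear(self, key):               # called after a successful authentication
        self.hits.pop(key, None)

limiter = FixedWindowLimiter(limit=3)
for _ in range(3):
    limiter.check("device-1")           # first three attempts pass
try:
    limiter.check("device-1")           # fourth attempt inside the window
except RateLimitExceeded:
    print("429")
limiter.clear("device-1")
limiter.check("device-1")               # allowed again after clearing
```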
cps/logger.py

@@ -150,7 +150,7 @@ def setup(log_file, log_level=None):
     else:
         try:
             file_handler = RotatingFileHandler(log_file, maxBytes=100000, backupCount=2, encoding='utf-8')
-        except IOError:
+        except (IOError, PermissionError):
             if log_file == DEFAULT_LOG_FILE:
                 raise
             file_handler = RotatingFileHandler(DEFAULT_LOG_FILE, maxBytes=100000, backupCount=2, encoding='utf-8')
@@ -177,7 +177,7 @@ def create_access_log(log_file, log_name, formatter):
     access_log.setLevel(logging.INFO)
     try:
         file_handler = RotatingFileHandler(log_file, maxBytes=50000, backupCount=2, encoding='utf-8')
-    except IOError:
+    except (IOError, PermissionError):
         if log_file == DEFAULT_ACCESS_LOG:
             raise
         file_handler = RotatingFileHandler(DEFAULT_ACCESS_LOG, maxBytes=50000, backupCount=2, encoding='utf-8')
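Both hunks widen the fallback that switches to the default log file when the configured path cannot be opened. A minimal reproduction of that fallback (paths are illustrative; note that in Python 3 `PermissionError` is already a subclass of `OSError`/`IOError`, so listing it explicitly is belt-and-braces rather than new behavior):

```python
import os
import tempfile
from logging.handlers import RotatingFileHandler

# Illustrative default path; the real DEFAULT_LOG_FILE is defined in cps/logger.py
DEFAULT_LOG_FILE = os.path.join(tempfile.gettempdir(), "calibre-web.log")

def open_log_handler(log_file):
    try:
        return RotatingFileHandler(log_file, maxBytes=100000, backupCount=2, encoding="utf-8")
    except (IOError, PermissionError):
        if log_file == DEFAULT_LOG_FILE:
            raise  # nowhere left to fall back to
        return RotatingFileHandler(DEFAULT_LOG_FILE, maxBytes=100000, backupCount=2, encoding="utf-8")

handler = open_log_handler("/nonexistent-dir/app.log")  # falls back to the default file
print(os.path.basename(handler.baseFilename))  # calibre-web.log
handler.close()
```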
cps/main.py

@@ -18,9 +18,14 @@

 import sys

-from . import create_app
+from . import create_app, limiter
 from .jinjia import jinjia
 from .remotelogin import remotelogin
+from flask import request
+
+
+def request_username():
+    return request.authorization.username
+

 def main():
     app = create_app()
@@ -39,6 +44,7 @@ def main():
     try:
         from .kobo import kobo, get_kobo_activated
         from .kobo_auth import kobo_auth
+        from flask_limiter.util import get_remote_address
         kobo_available = get_kobo_activated()
     except (ImportError, AttributeError):  # Catch also error for not installed flask-WTF (missing csrf decorator)
         kobo_available = False
@@ -56,6 +62,7 @@ def main():
     app.register_blueprint(tasks)
     app.register_blueprint(web)
     app.register_blueprint(opds)
+    limiter.limit("3/minute", key_func=request_username)(opds)
     app.register_blueprint(jinjia)
     app.register_blueprint(about)
     app.register_blueprint(shelf)
@@ -67,6 +74,7 @@ def main():
     if kobo_available:
         app.register_blueprint(kobo)
         app.register_blueprint(kobo_auth)
+        limiter.limit("3/minute", key_func=get_remote_address)(kobo)
     if oauth_available:
         app.register_blueprint(oauth)
     success = web_server.start()
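The two `limiter.limit` calls deliberately use different keys: OPDS requests carry HTTP basic auth, so they are throttled per username (`request_username`), while Kobo devices authenticate by token and are throttled per client address (`get_remote_address`). A small illustration of why the key function matters (the limiter here is a plain counter, not flask-limiter):

```python
from collections import Counter

hits = Counter()
LIMIT = 3

def allow(key):
    """Count a hit against the key; allow at most LIMIT hits per key."""
    hits[key] += 1
    return hits[key] <= LIMIT

# Keyed per username: one misbehaving OPDS client cannot exhaust another user's quota
print([allow("alice") for _ in range(4)])  # [True, True, True, False]
print(allow("bob"))                        # True
```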
cps/metadata_provider/amazon.py

@@ -63,11 +63,11 @@ class Amazon(Metadata):
             r.raise_for_status()
         except Exception as ex:
             log.warning(ex)
-            return
+            return None
         long_soup = BS(r.text, "lxml")  #~4sec :/
         soup2 = long_soup.find("div", attrs={"cel_widget_id": "dpx-books-ppd_csm_instrumentation_wrapper"})
         if soup2 is None:
-            return
+            return None
         try:
             match = MetaRecord(
                 title = "",
@@ -98,7 +98,7 @@ class Amazon(Metadata):
             try:
                 match.authors = [next(
                     filter(lambda i: i != " " and i != "\n" and not i.startswith("{"),
-                           x.findAll(text=True))).strip()
+                           x.findAll(string=True))).strip()
                     for x in soup2.findAll("span", attrs={"class": "author"})]
             except (AttributeError, TypeError, StopIteration):
                 match.authors = ""
@@ -115,7 +115,7 @@ class Amazon(Metadata):
                 return match, index
             except Exception as e:
                 log.error_or_exception(e)
-                return
+                return None

         val = list()
         if self.active:
@@ -127,10 +127,10 @@ class Amazon(Metadata):
                 results.raise_for_status()
             except requests.exceptions.HTTPError as e:
                 log.error_or_exception(e)
-                return None
+                return []
             except Exception as e:
                 log.warning(e)
-                return None
+                return []
             soup = BS(results.text, 'html.parser')
             links_list = [next(filter(lambda i: "digital-text" in i["href"], x.findAll("a")))["href"] for x in
                           soup.findAll("div", attrs={"data-component-type": "s-search-result"})]
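Several return paths here change from `None` to `[]` (and bare `return` to an explicit `return None`). Returning an empty list keeps callers that iterate over or concatenate provider results working with no extra guard, whereas `None` forces every caller to check first. A minimal illustration (the function names are made up for the example):

```python
def search_v1(query):
    """Old style: None on failure, so every caller must guard against it."""
    return None

def search_v2(query):
    """New style: empty list on failure, so iteration simply yields nothing."""
    return []

combined = []
for provider in (search_v2, search_v2):
    combined.extend(provider("dune"))   # safe: extend([]) is a no-op
print(combined)                         # []

try:
    combined.extend(search_v1("dune"))  # None is not iterable
except TypeError:
    print("caller crashes on None")
```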
cps/metadata_provider/douban.py

@@ -43,7 +43,8 @@ class Douban(Metadata):
     __id__ = "douban"
     DESCRIPTION = "豆瓣"
     META_URL = "https://book.douban.com/"
-    SEARCH_URL = "https://www.douban.com/j/search"
+    SEARCH_JSON_URL = "https://www.douban.com/j/search"
+    SEARCH_URL = "https://www.douban.com/search"

     ID_PATTERN = re.compile(r"sid: (?P<id>\d+),")
     AUTHORS_PATTERN = re.compile(r"作者|译者")
@@ -52,6 +53,7 @@ class Douban(Metadata):
     PUBLISHED_DATE_PATTERN = re.compile(r"出版年")
     SERIES_PATTERN = re.compile(r"丛书")
     IDENTIFIERS_PATTERN = re.compile(r"ISBN|统一书号")
+    CRITERIA_PATTERN = re.compile("criteria = '(.+)'")

     TITTLE_XPATH = "//span[@property='v:itemreviewed']"
     COVER_XPATH = "//a[@class='nbg']"
@@ -63,56 +65,90 @@ class Douban(Metadata):
     session = requests.Session()
     session.headers = {
         'user-agent':
             'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.102 Safari/537.36 Edg/98.0.1108.56',
     }

-    def search(
-        self, query: str, generic_cover: str = "", locale: str = "en"
-    ) -> Optional[List[MetaRecord]]:
+    def search(self,
+               query: str,
+               generic_cover: str = "",
+               locale: str = "en") -> List[MetaRecord]:
+        val = []
         if self.active:
-            log.debug(f"starting search {query} on douban")
+            log.debug(f"start searching {query} on douban")
             if title_tokens := list(
-                self.get_title_tokens(query, strip_joiners=False)
-            ):
+                    self.get_title_tokens(query, strip_joiners=False)):
                 query = "+".join(title_tokens)

-            try:
-                r = self.session.get(
-                    self.SEARCH_URL, params={"cat": 1001, "q": query}
-                )
-                r.raise_for_status()
-            except Exception as e:
-                log.warning(e)
-                return None
-
-            results = r.json()
-            if results["total"] == 0:
-                return []
-
-            book_id_list = [
-                self.ID_PATTERN.search(item).group("id")
-                for item in results["items"][:10] if self.ID_PATTERN.search(item)
-            ]
-
-            with futures.ThreadPoolExecutor(max_workers=5) as executor:
+            book_id_list = self._get_book_id_list_from_html(query)
+
+            if not book_id_list:
+                log.debug("No search results in Douban")
+                return []
+
+            with futures.ThreadPoolExecutor(
+                    max_workers=5, thread_name_prefix='douban') as executor:

                 fut = [
-                    executor.submit(self._parse_single_book, book_id, generic_cover)
-                    for book_id in book_id_list
+                    executor.submit(self._parse_single_book, book_id,
+                                    generic_cover) for book_id in book_id_list
                 ]

                 val = [
-                    future.result()
-                    for future in futures.as_completed(fut) if future.result()
+                    future.result() for future in futures.as_completed(fut)
+                    if future.result()
                 ]

         return val

-    def _parse_single_book(
-        self, id: str, generic_cover: str = ""
-    ) -> Optional[MetaRecord]:
+    def _get_book_id_list_from_html(self, query: str) -> List[str]:
+        try:
+            r = self.session.get(self.SEARCH_URL,
+                                 params={
+                                     "cat": 1001,
+                                     "q": query
+                                 })
+            r.raise_for_status()
+        except Exception as e:
+            log.warning(e)
+            return []
+
+        html = etree.HTML(r.content.decode("utf8"))
+        result_list = html.xpath(self.COVER_XPATH)
+
+        return [
+            self.ID_PATTERN.search(item.get("onclick")).group("id")
+            for item in result_list[:10]
+            if self.ID_PATTERN.search(item.get("onclick"))
+        ]
+
+    def _get_book_id_list_from_json(self, query: str) -> List[str]:
+        try:
+            r = self.session.get(self.SEARCH_JSON_URL,
+                                 params={
+                                     "cat": 1001,
+                                     "q": query
+                                 })
+            r.raise_for_status()
+        except Exception as e:
+            log.warning(e)
+            return []
+
+        results = r.json()
+        if results["total"] == 0:
+            return []
+
+        return [
+            self.ID_PATTERN.search(item).group("id")
+            for item in results["items"][:10] if self.ID_PATTERN.search(item)
+        ]
+
+    def _parse_single_book(self,
+                           id: str,
+                           generic_cover: str = "") -> Optional[MetaRecord]:
         url = f"https://book.douban.com/subject/{id}/"
+        log.debug(f"start parsing {url}")

         try:
             r = self.session.get(url)
@ -133,10 +169,12 @@ class Douban(Metadata):
|
||||||
),
|
),
|
||||||
)
|
)
|
||||||
|
|
||||||
html = etree.HTML(r.content.decode("utf8"))
|
decode_content = r.content.decode("utf8")
|
||||||
|
html = etree.HTML(decode_content)
|
||||||
|
|
||||||
match.title = html.xpath(self.TITTLE_XPATH)[0].text
|
match.title = html.xpath(self.TITTLE_XPATH)[0].text
|
||||||
match.cover = html.xpath(self.COVER_XPATH)[0].attrib["href"] or generic_cover
|
match.cover = html.xpath(
|
||||||
|
self.COVER_XPATH)[0].attrib["href"] or generic_cover
|
||||||
try:
|
try:
|
||||||
rating_num = float(html.xpath(self.RATING_XPATH)[0].text.strip())
|
rating_num = float(html.xpath(self.RATING_XPATH)[0].text.strip())
|
||||||
except Exception:
|
except Exception:
|
||||||
|
@ -146,35 +184,39 @@ class Douban(Metadata):
|
||||||
tag_elements = html.xpath(self.TAGS_XPATH)
|
tag_elements = html.xpath(self.TAGS_XPATH)
|
||||||
if len(tag_elements):
|
if len(tag_elements):
|
||||||
match.tags = [tag_element.text for tag_element in tag_elements]
|
match.tags = [tag_element.text for tag_element in tag_elements]
|
||||||
|
else:
|
||||||
|
match.tags = self._get_tags(decode_content)
|
||||||
|
|
||||||
description_element = html.xpath(self.DESCRIPTION_XPATH)
|
description_element = html.xpath(self.DESCRIPTION_XPATH)
|
||||||
if len(description_element):
|
if len(description_element):
|
||||||
match.description = html2text(etree.tostring(
|
match.description = html2text(
|
||||||
description_element[-1], encoding="utf8").decode("utf8"))
|
etree.tostring(description_element[-1]).decode("utf8"))
|
||||||
|
|
||||||
info = html.xpath(self.INFO_XPATH)
|
info = html.xpath(self.INFO_XPATH)
|
||||||
|
|
||||||
for element in info:
|
for element in info:
|
||||||
text = element.text
|
text = element.text
|
||||||
if self.AUTHORS_PATTERN.search(text):
|
if self.AUTHORS_PATTERN.search(text):
|
||||||
next = element.getnext()
|
next_element = element.getnext()
|
||||||
while next is not None and next.tag != "br":
|
while next_element is not None and next_element.tag != "br":
|
||||||
match.authors.append(next.text)
|
match.authors.append(next_element.text)
|
||||||
next = next.getnext()
|
next_element = next_element.getnext()
|
||||||
elif self.PUBLISHER_PATTERN.search(text):
|
elif self.PUBLISHER_PATTERN.search(text):
|
||||||
match.publisher = element.tail.strip()
|
if publisher := element.tail.strip():
|
||||||
|
match.publisher = publisher
|
||||||
|
else:
|
||||||
|
match.publisher = element.getnext().text
|
||||||
elif self.SUBTITLE_PATTERN.search(text):
|
elif self.SUBTITLE_PATTERN.search(text):
|
||||||
match.title = f'{match.title}:' + element.tail.strip()
|
match.title = f'{match.title}:{element.tail.strip()}'
|
||||||
elif self.PUBLISHED_DATE_PATTERN.search(text):
|
elif self.PUBLISHED_DATE_PATTERN.search(text):
|
||||||
match.publishedDate = self._clean_date(element.tail.strip())
|
match.publishedDate = self._clean_date(element.tail.strip())
|
||||||
elif self.SUBTITLE_PATTERN.search(text):
|
elif self.SERIES_PATTERN.search(text):
|
||||||
match.series = element.getnext().text
|
match.series = element.getnext().text
|
||||||
elif i_type := self.IDENTIFIERS_PATTERN.search(text):
|
elif i_type := self.IDENTIFIERS_PATTERN.search(text):
|
||||||
match.identifiers[i_type.group()] = element.tail.strip()
|
match.identifiers[i_type.group()] = element.tail.strip()
|
||||||
|
|
||||||
return match
|
return match
|
||||||
|
|
||||||
|
|
||||||
def _clean_date(self, date: str) -> str:
|
def _clean_date(self, date: str) -> str:
|
||||||
"""
|
"""
|
||||||
Clean up the date string to be in the format YYYY-MM-DD
|
Clean up the date string to be in the format YYYY-MM-DD
|
||||||
|
@ -194,13 +236,24 @@ class Douban(Metadata):
|
||||||
if date[i].isdigit():
|
if date[i].isdigit():
|
||||||
digit.append(date[i])
|
digit.append(date[i])
|
||||||
elif digit:
|
elif digit:
|
||||||
ls.append("".join(digit) if len(digit)==2 else f"0{digit[0]}")
|
ls.append("".join(digit) if len(digit) ==
|
||||||
|
2 else f"0{digit[0]}")
|
||||||
digit = []
|
digit = []
|
||||||
if digit:
|
if digit:
|
||||||
ls.append("".join(digit) if len(digit)==2 else f"0{digit[0]}")
|
ls.append("".join(digit) if len(digit) ==
|
||||||
|
2 else f"0{digit[0]}")
|
||||||
|
|
||||||
moon = ls[0]
|
moon = ls[0]
|
||||||
if len(ls)>1:
|
if len(ls) > 1:
|
||||||
day = ls[1]
|
day = ls[1]
|
||||||
|
|
||||||
return f"{year}-{moon}-{day}"
|
return f"{year}-{moon}-{day}"
|
||||||
|
|
||||||
|
def _get_tags(self, text: str) -> List[str]:
|
||||||
|
tags = []
|
||||||
|
if criteria := self.CRITERIA_PATTERN.search(text):
|
||||||
|
tags.extend(
|
||||||
|
item.replace('7:', '') for item in criteria.group().split('|')
|
||||||
|
if item.startswith('7:'))
|
||||||
|
|
||||||
|
return tags
|
||||||
|
|
|
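The `_clean_date` hunk only shows the digit-grouping middle of the method; the year extraction and the month/day defaults are not visible in this diff. A standalone sketch of the whole routine, assuming the year is the first four characters and that month/day default to `01` (both assumptions, not shown in the hunk):

```python
def clean_date(date: str) -> str:
    # Sketch of the _clean_date logic above: keep the year, then
    # zero-pad any one-digit month/day found in the remainder.
    year = date[:4]          # assumption: dates start with YYYY
    moon = "01"              # assumed defaults when month/day absent
    day = "01"
    if len(date) > 5:
        digit = []
        ls = []
        for i in range(5, len(date)):
            if date[i].isdigit():
                digit.append(date[i])
            elif digit:
                ls.append("".join(digit) if len(digit) == 2 else f"0{digit[0]}")
                digit = []
        if digit:
            ls.append("".join(digit) if len(digit) == 2 else f"0{digit[0]}")
        moon = ls[0]
        if len(ls) > 1:
            day = ls[1]
    return f"{year}-{moon}-{day}"
```

This handles Douban's mixed formats — `"1987年7月15日"` and `"2002-1-1"` both normalize to `YYYY-MM-DD` — because the loop only cares about digit runs and treats every non-digit as a separator.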
@@ -19,6 +19,7 @@
 # Google Books api document: https://developers.google.com/books/docs/v1/using
 from typing import Dict, List, Optional
 from urllib.parse import quote
+from datetime import datetime

 import requests

@@ -81,7 +82,11 @@ class Google(Metadata):
         match.description = result["volumeInfo"].get("description", "")
         match.languages = self._parse_languages(result=result, locale=locale)
         match.publisher = result["volumeInfo"].get("publisher", "")
-        match.publishedDate = result["volumeInfo"].get("publishedDate", "")
+        try:
+            datetime.strptime(result["volumeInfo"].get("publishedDate", ""), "%Y-%m-%d")
+            match.publishedDate = result["volumeInfo"].get("publishedDate", "")
+        except ValueError:
+            match.publishedDate = ""
         match.rating = result["volumeInfo"].get("averageRating", 0)
         match.series, match.series_index = "", 1
         match.tags = result["volumeInfo"].get("categories", [])
@@ -103,6 +108,13 @@ class Google(Metadata):
     def _parse_cover(result: Dict, generic_cover: str) -> str:
         if result["volumeInfo"].get("imageLinks"):
             cover_url = result["volumeInfo"]["imageLinks"]["thumbnail"]
+
+            # strip curl in cover
+            cover_url = cover_url.replace("&edge=curl", "")
+
+            # request 800x900 cover image (higher resolution)
+            cover_url += "&fife=w800-h900"
+
             return cover_url.replace("http://", "https://")
         return generic_cover
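The Google hunk wraps the `publishedDate` assignment in a `strptime` check so that partial dates are discarded. The Books API can return `"1999"` or `"1999-07"` for `publishedDate`, and `strptime("%Y-%m-%d")` raises `ValueError` for anything that is not a full date. The validation in isolation:

```python
from datetime import datetime

def safe_published_date(volume_info: dict) -> str:
    # Only keep publishedDate when it is a complete YYYY-MM-DD date;
    # year-only or year-month values fail strptime and are dropped.
    raw = volume_info.get("publishedDate", "")
    try:
        datetime.strptime(raw, "%Y-%m-%d")
        return raw
    except ValueError:
        return ""
```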
@@ -97,12 +97,14 @@ class LubimyCzytac(Metadata):
     LANGUAGES = f"{CONTAINER}//dt[contains(text(),'Język:')]{SIBLINGS}/text()"
     DESCRIPTION = f"{CONTAINER}//div[@class='collapse-content']"
     SERIES = f"{CONTAINER}//span/a[contains(@href,'/cykl/')]/text()"
+    TRANSLATOR = f"{CONTAINER}//dt[contains(text(),'Tłumacz:')]{SIBLINGS}/a/text()"

     DETAILS = "//div[@id='book-details']"
     PUBLISH_DATE = "//dt[contains(@title,'Data pierwszego wydania"
     FIRST_PUBLISH_DATE = f"{DETAILS}{PUBLISH_DATE} oryginalnego')]{SIBLINGS}[1]/text()"
     FIRST_PUBLISH_DATE_PL = f"{DETAILS}{PUBLISH_DATE} polskiego')]{SIBLINGS}[1]/text()"
-    TAGS = "//nav[@aria-label='breadcrumb']//a[contains(@href,'/ksiazki/k/')]/text()"
+    TAGS = "//a[contains(@href,'/ksiazki/t/')]/text()"  # "//nav[@aria-label='breadcrumbs']//a[contains(@href,'/ksiazki/k/')]/span/text()"

     RATING = "//meta[@property='books:rating:value']/@content"
     COVER = "//meta[@property='og:image']/@content"
@@ -158,6 +160,7 @@ class LubimyCzytac(Metadata):

 class LubimyCzytacParser:
     PAGES_TEMPLATE = "<p id='strony'>Książka ma {0} stron(y).</p>"
+    TRANSLATOR_TEMPLATE = "<p id='translator'>Tłumacz: {0}</p>"
     PUBLISH_DATE_TEMPLATE = "<p id='pierwsze_wydanie'>Data pierwszego wydania: {0}</p>"
     PUBLISH_DATE_PL_TEMPLATE = (
         "<p id='pierwsze_wydanie'>Data pierwszego wydania w Polsce: {0}</p>"
@@ -346,5 +349,9 @@ class LubimyCzytacParser:
             description += LubimyCzytacParser.PUBLISH_DATE_PL_TEMPLATE.format(
                 first_publish_date_pl.strftime("%d.%m.%Y")
             )
+        translator = self._parse_xpath_node(xpath=LubimyCzytac.TRANSLATOR)
+        if translator:
+            description += LubimyCzytacParser.TRANSLATOR_TEMPLATE.format(translator)
+
         return description
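The LubimyCzytac change follows the pattern the parser already uses for page count and publish dates: look a value up via XPath, and append a small HTML paragraph to the description only when the lookup returned something. The new translator branch in isolation (only the template constant is taken from the diff; the function wrapper is an illustration):

```python
TRANSLATOR_TEMPLATE = "<p id='translator'>Tłumacz: {0}</p>"

def append_translator(description: str, translator: str) -> str:
    # Mirrors the new branch above: the translator paragraph is only
    # appended when the XPath lookup produced a non-empty value.
    if translator:
        description += TRANSLATOR_TEMPLATE.format(translator)
    return description
```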
@@ -54,7 +54,7 @@ class scholar(Metadata):
             scholar_gen = itertools.islice(scholarly.search_pubs(query), 10)
         except Exception as e:
             log.warning(e)
-            return None
+            return list()
         for result in scholar_gen:
             match = self._parse_search_result(
                 result=result, generic_cover="", locale=locale
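The one-line scholar fix matters because callers iterate the return value: returning `None` on failure forced every caller to guard first, while an empty list keeps the contract "always iterable". The shape of the pattern, with a generic callable standing in for the `scholarly` backend:

```python
import itertools

def safe_islice(gen_factory, n=10):
    # On failure return an empty list (the fix above), so a caller's
    # "for result in ..." loop still works without a None check.
    try:
        gen = itertools.islice(gen_factory(), n)
    except Exception:
        return list()
    return list(gen)
```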
@@ -74,7 +74,7 @@ def register_user_with_oauth(user=None):
     if len(all_oauth.keys()) == 0:
         return
     if user is None:
-        flash(_(u"Register with %(provider)s", provider=", ".join(list(all_oauth.values()))), category="success")
+        flash(_("Register with %(provider)s", provider=", ".join(list(all_oauth.values()))), category="success")
     else:
         for oauth_key in all_oauth.keys():
             # Find this OAuth token in the database, or create it
@@ -134,8 +134,8 @@ def bind_oauth_or_register(provider_id, provider_user_id, redirect_url, provider
         # already bind with user, just login
         if oauth_entry.user:
             login_user(oauth_entry.user)
-            log.debug(u"You are now logged in as: '%s'", oauth_entry.user.name)
-            flash(_(u"you are now logged in as: '%(nickname)s'", nickname= oauth_entry.user.name),
+            log.debug("You are now logged in as: '%s'", oauth_entry.user.name)
+            flash(_("Success! You are now logged in as: %(nickname)s", nickname= oauth_entry.user.name),
                   category="success")
             return redirect(url_for('web.index'))
         else:
@@ -145,21 +145,21 @@ def bind_oauth_or_register(provider_id, provider_user_id, redirect_url, provider
             try:
                 ub.session.add(oauth_entry)
                 ub.session.commit()
-                flash(_(u"Link to %(oauth)s Succeeded", oauth=provider_name), category="success")
+                flash(_("Link to %(oauth)s Succeeded", oauth=provider_name), category="success")
                 log.info("Link to {} Succeeded".format(provider_name))
                 return redirect(url_for('web.profile'))
             except Exception as ex:
                 log.error_or_exception(ex)
                 ub.session.rollback()
         else:
-            flash(_(u"Login failed, No User Linked With OAuth Account"), category="error")
+            flash(_("Login failed, No User Linked With OAuth Account"), category="error")
             log.info('Login failed, No User Linked With OAuth Account')
             return redirect(url_for('web.login'))
         # return redirect(url_for('web.login'))
         # if config.config_public_reg:
         #    return redirect(url_for('web.register'))
         # else:
-        #    flash(_(u"Public registration is not enabled"), category="error")
+        #    flash(_("Public registration is not enabled"), category="error")
         #    return redirect(url_for(redirect_url))
     except (NoResultFound, AttributeError):
         return redirect(url_for(redirect_url))
@@ -194,15 +194,15 @@ def unlink_oauth(provider):
             ub.session.delete(oauth_entry)
             ub.session.commit()
             logout_oauth_user()
-            flash(_(u"Unlink to %(oauth)s Succeeded", oauth=oauth_check[provider]), category="success")
+            flash(_("Unlink to %(oauth)s Succeeded", oauth=oauth_check[provider]), category="success")
             log.info("Unlink to {} Succeeded".format(oauth_check[provider]))
         except Exception as ex:
             log.error_or_exception(ex)
             ub.session.rollback()
-            flash(_(u"Unlink to %(oauth)s Failed", oauth=oauth_check[provider]), category="error")
+            flash(_("Unlink to %(oauth)s Failed", oauth=oauth_check[provider]), category="error")
     except NoResultFound:
         log.warning("oauth %s for user %d not found", provider, current_user.id)
-        flash(_(u"Not Linked to %(oauth)s", oauth=provider), category="error")
+        flash(_("Not Linked to %(oauth)s", oauth=provider), category="error")
     return redirect(url_for('web.profile'))


 def generate_oauth_blueprints():

@@ -258,13 +258,13 @@ if ub.oauth_support:
     @oauth_authorized.connect_via(oauthblueprints[0]['blueprint'])
     def github_logged_in(blueprint, token):
         if not token:
-            flash(_(u"Failed to log in with GitHub."), category="error")
+            flash(_("Failed to log in with GitHub."), category="error")
             log.error("Failed to log in with GitHub")
             return False

         resp = blueprint.session.get("/user")
         if not resp.ok:
-            flash(_(u"Failed to fetch user info from GitHub."), category="error")
+            flash(_("Failed to fetch user info from GitHub."), category="error")
             log.error("Failed to fetch user info from GitHub")
             return False

@@ -276,13 +276,13 @@ if ub.oauth_support:
     @oauth_authorized.connect_via(oauthblueprints[1]['blueprint'])
     def google_logged_in(blueprint, token):
         if not token:
-            flash(_(u"Failed to log in with Google."), category="error")
+            flash(_("Failed to log in with Google."), category="error")
             log.error("Failed to log in with Google")
             return False

         resp = blueprint.session.get("/oauth2/v2/userinfo")
         if not resp.ok:
-            flash(_(u"Failed to fetch user info from Google."), category="error")
+            flash(_("Failed to fetch user info from Google."), category="error")
             log.error("Failed to fetch user info from Google")
             return False

@@ -295,8 +295,8 @@ if ub.oauth_support:
     @oauth_error.connect_via(oauthblueprints[0]['blueprint'])
     def github_error(blueprint, error, error_description=None, error_uri=None):
         msg = (
-            u"OAuth error from {name}! "
-            u"error={error} description={description} uri={uri}"
+            "OAuth error from {name}! "
+            "error={error} description={description} uri={uri}"
         ).format(
             name=blueprint.name,
             error=error,
@@ -308,8 +308,8 @@ if ub.oauth_support:
     @oauth_error.connect_via(oauthblueprints[1]['blueprint'])
     def google_error(blueprint, error, error_description=None, error_uri=None):
         msg = (
-            u"OAuth error from {name}! "
-            u"error={error} description={description} uri={uri}"
+            "OAuth error from {name}! "
+            "error={error} description={description} uri={uri}"
         ).format(
             name=blueprint.name,
             error=error,
@@ -329,10 +329,10 @@ def github_login():
         if account_info.ok:
             account_info_json = account_info.json()
             return bind_oauth_or_register(oauthblueprints[0]['id'], account_info_json['id'], 'github.login', 'github')
-        flash(_(u"GitHub Oauth error, please retry later."), category="error")
+        flash(_("GitHub Oauth error, please retry later."), category="error")
         log.error("GitHub Oauth error, please retry later")
     except (InvalidGrantError, TokenExpiredError) as e:
-        flash(_(u"GitHub Oauth error: {}").format(e), category="error")
+        flash(_("GitHub Oauth error: {}").format(e), category="error")
         log.error(e)
     return redirect(url_for('web.login'))

@@ -353,10 +353,10 @@ def google_login():
         if resp.ok:
             account_info_json = resp.json()
             return bind_oauth_or_register(oauthblueprints[1]['id'], account_info_json['id'], 'google.login', 'google')
-        flash(_(u"Google Oauth error, please retry later."), category="error")
+        flash(_("Google Oauth error, please retry later."), category="error")
         log.error("Google Oauth error, please retry later")
     except (InvalidGrantError, TokenExpiredError) as e:
-        flash(_(u"Google Oauth error: {}").format(e), category="error")
+        flash(_("Google Oauth error: {}").format(e), category="error")
         log.error(e)
     return redirect(url_for('web.login'))
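Most of the OAuth hunks simply drop `u""` string prefixes. In Python 3 this is a pure cleanup: every `str` literal is already unicode, so the prefix changes nothing at runtime, which is why each hunk can touch only the literal and leave the surrounding `flash`/`gettext` call intact:

```python
# The u"" prefix is a no-op in Python 3 — both literals are the same
# str object type with the same value, so dropping it is behavior-neutral.
legacy = u"Link to %(oauth)s Succeeded"
modern = "Link to %(oauth)s Succeeded"
assert legacy == modern
assert type(legacy) is type(modern) is str
```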
111
cps/opds.py
|
@ -21,41 +21,28 @@
|
||||||
# along with this program. If not, see <http://www.gnu.org/licenses/>.
|
# along with this program. If not, see <http://www.gnu.org/licenses/>.
|
||||||
|
|
||||||
import datetime
|
import datetime
|
||||||
|
import json
|
||||||
from urllib.parse import unquote_plus
|
from urllib.parse import unquote_plus
|
||||||
from functools import wraps
|
|
||||||
|
|
||||||
from flask import Blueprint, request, render_template, Response, g, make_response, abort
|
from flask import Blueprint, request, render_template, make_response, abort, Response, g
|
||||||
from flask_login import current_user
|
from flask_login import current_user
|
||||||
from flask_babel import get_locale
|
from flask_babel import get_locale
|
||||||
|
from flask_babel import gettext as _
|
||||||
from sqlalchemy.sql.expression import func, text, or_, and_, true
|
from sqlalchemy.sql.expression import func, text, or_, and_, true
|
||||||
from sqlalchemy.exc import InvalidRequestError, OperationalError
|
from sqlalchemy.exc import InvalidRequestError, OperationalError
|
||||||
from werkzeug.security import check_password_hash
|
|
||||||
|
|
||||||
from . import constants, logger, config, db, calibre_db, ub, services, isoLanguages
|
from . import logger, config, db, calibre_db, ub, isoLanguages, constants
|
||||||
|
from .usermanagement import requires_basic_auth_if_no_ano
|
||||||
from .helper import get_download_link, get_book_cover
|
from .helper import get_download_link, get_book_cover
|
||||||
from .pagination import Pagination
|
from .pagination import Pagination
|
||||||
from .web import render_read_books
|
from .web import render_read_books
|
||||||
from .usermanagement import load_user_from_request
|
|
||||||
from flask_babel import gettext as _
|
|
||||||
|
|
||||||
opds = Blueprint('opds', __name__)
|
opds = Blueprint('opds', __name__)
|
||||||
|
|
||||||
log = logger.create()
|
log = logger.create()
|
||||||
|
|
||||||
|
|
||||||
def requires_basic_auth_if_no_ano(f):
|
|
||||||
@wraps(f)
|
|
||||||
def decorated(*args, **kwargs):
|
|
||||||
auth = request.authorization
|
|
||||||
if config.config_anonbrowse != 1:
|
|
||||||
if not auth or auth.type != 'basic' or not check_auth(auth.username, auth.password):
|
|
||||||
return authenticate()
|
|
||||||
return f(*args, **kwargs)
|
|
||||||
if config.config_login_type == constants.LOGIN_LDAP and services.ldap and config.config_anonbrowse != 1:
|
|
||||||
return services.ldap.basic_auth_required(f)
|
|
||||||
return decorated
|
|
||||||
|
|
||||||
|
|
||||||
@opds.route("/opds/")
|
@opds.route("/opds/")
|
||||||
@opds.route("/opds")
|
@opds.route("/opds")
|
||||||
@requires_basic_auth_if_no_ano
|
@requires_basic_auth_if_no_ano
|
||||||
|
@ -69,7 +56,7 @@ def feed_osd():
|
||||||
return render_xml_template('osd.xml', lang='en-EN')
|
return render_xml_template('osd.xml', lang='en-EN')
|
||||||
|
|
||||||
|
|
||||||
@opds.route("/opds/search", defaults={'query': ""})
|
# @opds.route("/opds/search", defaults={'query': ""})
|
||||||
@opds.route("/opds/search/<path:query>")
|
@opds.route("/opds/search/<path:query>")
|
||||||
@requires_basic_auth_if_no_ano
|
@requires_basic_auth_if_no_ano
|
||||||
def feed_cc_search(query):
|
def feed_cc_search(query):
|
||||||
|
@ -107,6 +94,8 @@ def feed_letter_books(book_id):
|
||||||
@opds.route("/opds/new")
|
@opds.route("/opds/new")
|
||||||
@requires_basic_auth_if_no_ano
|
@requires_basic_auth_if_no_ano
|
||||||
def feed_new():
|
def feed_new():
|
||||||
|
if not current_user.check_visibility(constants.SIDEBAR_RECENT):
|
||||||
|
abort(404)
|
||||||
off = request.args.get("offset") or 0
|
off = request.args.get("offset") or 0
|
||||||
entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
|
entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
|
||||||
db.Books, True, [db.Books.timestamp.desc()],
|
db.Books, True, [db.Books.timestamp.desc()],
|
||||||
|
@ -117,6 +106,8 @@ def feed_new():
|
||||||
@opds.route("/opds/discover")
|
@opds.route("/opds/discover")
|
||||||
@requires_basic_auth_if_no_ano
|
@requires_basic_auth_if_no_ano
|
||||||
def feed_discover():
|
def feed_discover():
|
||||||
|
if not current_user.check_visibility(constants.SIDEBAR_RANDOM):
|
||||||
|
abort(404)
|
||||||
query = calibre_db.generate_linked_query(config.config_read_column, db.Books)
|
query = calibre_db.generate_linked_query(config.config_read_column, db.Books)
|
||||||
entries = query.filter(calibre_db.common_filters()).order_by(func.random()).limit(config.config_books_per_page)
|
entries = query.filter(calibre_db.common_filters()).order_by(func.random()).limit(config.config_books_per_page)
|
||||||
pagination = Pagination(1, config.config_books_per_page, int(config.config_books_per_page))
|
pagination = Pagination(1, config.config_books_per_page, int(config.config_books_per_page))
|
||||||
|
@ -126,6 +117,8 @@ def feed_discover():
|
||||||
@opds.route("/opds/rated")
|
@opds.route("/opds/rated")
|
||||||
@requires_basic_auth_if_no_ano
|
@requires_basic_auth_if_no_ano
|
||||||
def feed_best_rated():
|
def feed_best_rated():
|
||||||
|
if not current_user.check_visibility(constants.SIDEBAR_BEST_RATED):
|
||||||
|
abort(404)
|
||||||
off = request.args.get("offset") or 0
|
off = request.args.get("offset") or 0
|
||||||
entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
|
entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
|
||||||
db.Books, db.Books.ratings.any(db.Ratings.rating > 9),
|
db.Books, db.Books.ratings.any(db.Ratings.rating > 9),
|
||||||
|
@ -137,6 +130,8 @@ def feed_best_rated():
|
||||||
@opds.route("/opds/hot")
|
@opds.route("/opds/hot")
|
||||||
@requires_basic_auth_if_no_ano
|
@requires_basic_auth_if_no_ano
|
||||||
def feed_hot():
|
def feed_hot():
|
||||||
|
if not current_user.check_visibility(constants.SIDEBAR_HOT):
|
||||||
|
abort(404)
|
||||||
off = request.args.get("offset") or 0
|
off = request.args.get("offset") or 0
|
||||||
all_books = ub.session.query(ub.Downloads, func.count(ub.Downloads.book_id)).order_by(
|
all_books = ub.session.query(ub.Downloads, func.count(ub.Downloads.book_id)).order_by(
|
||||||
func.count(ub.Downloads.book_id).desc()).group_by(ub.Downloads.book_id)
|
func.count(ub.Downloads.book_id).desc()).group_by(ub.Downloads.book_id)
|
||||||
|
@ -159,12 +154,16 @@ def feed_hot():
|
||||||
@opds.route("/opds/author")
|
@opds.route("/opds/author")
|
||||||
@requires_basic_auth_if_no_ano
|
@requires_basic_auth_if_no_ano
|
||||||
def feed_authorindex():
|
def feed_authorindex():
|
||||||
|
if not current_user.check_visibility(constants.SIDEBAR_AUTHOR):
|
||||||
|
abort(404)
|
||||||
return render_element_index(db.Authors.sort, db.books_authors_link, 'opds.feed_letter_author')
|
return render_element_index(db.Authors.sort, db.books_authors_link, 'opds.feed_letter_author')
|
||||||
|
|
||||||
|
|
||||||
@opds.route("/opds/author/letter/<book_id>")
|
@opds.route("/opds/author/letter/<book_id>")
|
||||||
@requires_basic_auth_if_no_ano
|
@requires_basic_auth_if_no_ano
|
||||||
def feed_letter_author(book_id):
|
def feed_letter_author(book_id):
|
||||||
|
if not current_user.check_visibility(constants.SIDEBAR_AUTHOR):
|
||||||
|
abort(404)
|
||||||
off = request.args.get("offset") or 0
|
off = request.args.get("offset") or 0
|
||||||
letter = true() if book_id == "00" else func.upper(db.Authors.sort).startswith(book_id)
|
letter = true() if book_id == "00" else func.upper(db.Authors.sort).startswith(book_id)
|
||||||
entries = calibre_db.session.query(db.Authors).join(db.books_authors_link).join(db.Books)\
|
entries = calibre_db.session.query(db.Authors).join(db.books_authors_link).join(db.Books)\
|
||||||
|
@ -186,6 +185,8 @@ def feed_author(book_id):
|
||||||
@opds.route("/opds/publisher")
|
@opds.route("/opds/publisher")
|
||||||
@requires_basic_auth_if_no_ano
|
@requires_basic_auth_if_no_ano
|
||||||
def feed_publisherindex():
|
def feed_publisherindex():
|
||||||
|
if not current_user.check_visibility(constants.SIDEBAR_PUBLISHER):
|
||||||
|
abort(404)
|
||||||
off = request.args.get("offset") or 0
|
off = request.args.get("offset") or 0
|
||||||
entries = calibre_db.session.query(db.Publishers)\
|
entries = calibre_db.session.query(db.Publishers)\
|
||||||
.join(db.books_publishers_link)\
|
.join(db.books_publishers_link)\
|
||||||
|
@ -207,12 +208,16 @@ def feed_publisher(book_id):
|
||||||
@opds.route("/opds/category")
|
@opds.route("/opds/category")
|
||||||
@requires_basic_auth_if_no_ano
|
@requires_basic_auth_if_no_ano
|
||||||
def feed_categoryindex():
|
def feed_categoryindex():
|
||||||
|
if not current_user.check_visibility(constants.SIDEBAR_CATEGORY):
|
||||||
|
abort(404)
|
||||||
return render_element_index(db.Tags.name, db.books_tags_link, 'opds.feed_letter_category')
|
return render_element_index(db.Tags.name, db.books_tags_link, 'opds.feed_letter_category')
|
||||||
|
|
||||||
|
|
||||||
@opds.route("/opds/category/letter/<book_id>")
|
@opds.route("/opds/category/letter/<book_id>")
|
||||||
@requires_basic_auth_if_no_ano
|
@requires_basic_auth_if_no_ano
|
||||||
def feed_letter_category(book_id):
|
def feed_letter_category(book_id):
|
||||||
|
if not current_user.check_visibility(constants.SIDEBAR_CATEGORY):
|
||||||
|
abort(404)
     off = request.args.get("offset") or 0
     letter = true() if book_id == "00" else func.upper(db.Tags.name).startswith(book_id)
     entries = calibre_db.session.query(db.Tags)\

@@ -236,12 +241,16 @@ def feed_category(book_id):
 @opds.route("/opds/series")
 @requires_basic_auth_if_no_ano
 def feed_seriesindex():
+    if not current_user.check_visibility(constants.SIDEBAR_SERIES):
+        abort(404)
     return render_element_index(db.Series.sort, db.books_series_link, 'opds.feed_letter_series')


 @opds.route("/opds/series/letter/<book_id>")
 @requires_basic_auth_if_no_ano
 def feed_letter_series(book_id):
+    if not current_user.check_visibility(constants.SIDEBAR_SERIES):
+        abort(404)
     off = request.args.get("offset") or 0
     letter = true() if book_id == "00" else func.upper(db.Series.sort).startswith(book_id)
     entries = calibre_db.session.query(db.Series)\

@@ -271,6 +280,8 @@ def feed_series(book_id):
 @opds.route("/opds/ratings")
 @requires_basic_auth_if_no_ano
 def feed_ratingindex():
+    if not current_user.check_visibility(constants.SIDEBAR_RATING):
+        abort(404)
     off = request.args.get("offset") or 0
     entries = calibre_db.session.query(db.Ratings, func.count('books_ratings_link.book').label('count'),
                                        (db.Ratings.rating / 2).label('name')) \

@@ -297,6 +308,8 @@ def feed_ratings(book_id):
 @opds.route("/opds/formats")
 @requires_basic_auth_if_no_ano
 def feed_formatindex():
+    if not current_user.check_visibility(constants.SIDEBAR_FORMAT):
+        abort(404)
     off = request.args.get("offset") or 0
     entries = calibre_db.session.query(db.Data).join(db.Books)\
         .filter(calibre_db.common_filters()) \

@@ -304,7 +317,6 @@ def feed_formatindex():
         .order_by(db.Data.format).all()
     pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
                             len(entries))
-
     element = list()
     for entry in entries:
         element.append(FeedObject(entry.format, entry.format))
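The `Pagination((int(off) / (int(config.config_books_per_page)) + 1), …)` expression recurs in every one of these feed handlers. A minimal stand-alone sketch of that offset-to-page arithmetic (the function name is illustrative, not the real `Pagination` class, and the sketch uses integer division `//` so the result is a plain int):

```python
# Minimal sketch of the offset -> page computation used by the OPDS feeds.
# page_from_offset is a hypothetical helper, not part of Calibre-Web.

def page_from_offset(offset, books_per_page):
    # The feeds pass (offset / per_page) + 1 as a 1-based page number;
    # offset arrives as a query-string value, so it may be a string.
    return int(offset) // int(books_per_page) + 1

print(page_from_offset("0", 60))   # first page
print(page_from_offset(120, 60))   # third page
```

The handlers read `request.args.get("offset") or 0`, so the helper must accept both string and int input, as above.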
@@ -327,8 +339,10 @@ def feed_format(book_id):
 @opds.route("/opds/language/")
 @requires_basic_auth_if_no_ano
 def feed_languagesindex():
+    if not current_user.check_visibility(constants.SIDEBAR_LANGUAGE):
+        abort(404)
     off = request.args.get("offset") or 0
-    if current_user.filter_language() == u"all":
+    if current_user.filter_language() == "all":
         languages = calibre_db.speaking_language()
     else:
         languages = calibre_db.session.query(db.Languages).filter(

@@ -354,8 +368,11 @@ def feed_languages(book_id):
 @opds.route("/opds/shelfindex")
 @requires_basic_auth_if_no_ano
 def feed_shelfindex():
+    if not (current_user.is_authenticated or g.allow_anonymous):
+        abort(404)
     off = request.args.get("offset") or 0
-    shelf = g.shelves_access
+    shelf = ub.session.query(ub.Shelf).filter(
+        or_(ub.Shelf.is_public == 1, ub.Shelf.user_id == current_user.id)).order_by(ub.Shelf.name).all()
     number = len(shelf)
     pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
                             number)

@@ -365,6 +382,8 @@ def feed_shelfindex():
 @opds.route("/opds/shelf/<int:book_id>")
 @requires_basic_auth_if_no_ano
 def feed_shelf(book_id):
+    if not (current_user.is_authenticated or g.allow_anonymous):
+        abort(404)
     off = request.args.get("offset") or 0
     if current_user.is_anonymous:
         shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.is_public == 1,
@@ -402,11 +421,7 @@ def feed_shelf(book_id):
 @opds.route("/opds/download/<book_id>/<book_format>/")
 @requires_basic_auth_if_no_ano
 def opds_download_link(book_id, book_format):
-    # I gave up with this: With enabled ldap login, the user doesn't get logged in, therefore it's always guest
-    # workaround, loading the user from the request and checking its download rights here
-    # in case of anonymous browsing user is None
-    user = load_user_from_request(request) or current_user
-    if not user.role_download():
+    if not current_user.role_download():
         return abort(403)
     if "Kobo" in request.headers.get('User-Agent'):
         client = "kobo"

@@ -429,6 +444,17 @@ def get_metadata_calibre_companion(uuid, library):
     return ""


+@opds.route("/opds/stats")
+@requires_basic_auth_if_no_ano
+def get_database_stats():
+    stat = dict()
+    stat['books'] = calibre_db.session.query(db.Books).count()
+    stat['authors'] = calibre_db.session.query(db.Authors).count()
+    stat['categories'] = calibre_db.session.query(db.Tags).count()
+    stat['series'] = calibre_db.session.query(db.Series).count()
+    return Response(json.dumps(stat), mimetype="application/json")
+
+
 @opds.route("/opds/thumb_240_240/<book_id>")
 @opds.route("/opds/cover_240_240/<book_id>")
 @opds.route("/opds/cover_90_90/<book_id>")
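The new `/opds/stats` endpoint returns a flat JSON object of library counts. A minimal sketch of the response shape (the counts below are made-up stand-ins for the real `calibre_db.session.query(...).count()` calls):

```python
import json

# Hypothetical stand-in data for the counts get_database_stats() collects.
counts = {'books': 1204, 'authors': 311, 'categories': 42, 'series': 97}

def database_stats(counts):
    # Mirrors the endpoint's shape: a flat dict serialized as JSON,
    # served with mimetype application/json.
    return json.dumps(counts)

payload = database_stats(counts)
assert json.loads(payload)['books'] == 1204
```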
@@ -441,6 +467,8 @@ def feed_get_cover(book_id):
 @opds.route("/opds/readbooks")
 @requires_basic_auth_if_no_ano
 def feed_read_books():
+    if not (current_user.check_visibility(constants.SIDEBAR_READ_AND_UNREAD) and not current_user.is_anonymous):
+        return abort(403)
     off = request.args.get("offset") or 0
     result, pagination = render_read_books(int(off) / (int(config.config_books_per_page)) + 1, True, True)
     return render_xml_template('feed.xml', entries=result, pagination=pagination)

@@ -449,6 +477,8 @@ def feed_read_books():
 @opds.route("/opds/unreadbooks")
 @requires_basic_auth_if_no_ano
 def feed_unread_books():
+    if not (current_user.check_visibility(constants.SIDEBAR_READ_AND_UNREAD) and not current_user.is_anonymous):
+        return abort(403)
     off = request.args.get("offset") or 0
     result, pagination = render_read_books(int(off) / (int(config.config_books_per_page)) + 1, False, True)
     return render_xml_template('feed.xml', entries=result, pagination=pagination)
@@ -478,32 +508,11 @@ def feed_search(term):
     return render_xml_template('feed.xml', searchterm="")


-def check_auth(username, password):
-    try:
-        username = username.encode('windows-1252')
-    except UnicodeEncodeError:
-        username = username.encode('utf-8')
-    user = ub.session.query(ub.User).filter(func.lower(ub.User.name) ==
-                                            username.decode('utf-8').lower()).first()
-    if bool(user and check_password_hash(str(user.password), password)):
-        return True
-    else:
-        ip_address = request.headers.get('X-Forwarded-For', request.remote_addr)
-        log.warning('OPDS Login failed for user "%s" IP-address: %s', username.decode('utf-8'), ip_address)
-        return False
-
-
-def authenticate():
-    return Response(
-        'Could not verify your access level for that URL.\n'
-        'You have to login with proper credentials', 401,
-        {'WWW-Authenticate': 'Basic realm="Login Required"'})
-
-
 def render_xml_template(*args, **kwargs):
     # ToDo: return time in current timezone similar to %z
     currtime = datetime.datetime.now().strftime("%Y-%m-%dT%H:%M:%S+00:00")
-    xml = render_template(current_time=currtime, instance=config.config_calibre_web_title, *args, **kwargs)
+    xml = render_template(current_time=currtime, instance=config.config_calibre_web_title, constants=constants.sidebar_settings, *args, **kwargs)
     response = make_response(xml)
     response.headers["Content-Type"] = "application/atom+xml; charset=utf-8"
     return response

@@ -528,7 +537,7 @@ def render_element_index(database_column, linked_table, folder):
     entries = entries.join(linked_table).join(db.Books)
     entries = entries.filter(calibre_db.common_filters()).group_by(func.upper(func.substr(database_column, 1, 1))).all()
     elements = []
-    if off == 0:
+    if off == 0 and entries:
         elements.append({'id': "00", 'name': _("All")})
         shift = 1
     for entry in entries[
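The `if off == 0 and entries:` change in `render_element_index` means the synthetic "All" entry is only prepended on the first page, and only when the index actually has entries. A small sketch of that behaviour (a hypothetical helper, not the real function):

```python
# Sketch of the fixed index behaviour: the "All" element and the shift of 1
# only apply on the first page (off == 0) and when entries is non-empty.
def index_elements(entries, off):
    elements = []
    shift = 0
    if off == 0 and entries:
        elements.append({'id': "00", 'name': "All"})
        shift = 1
    return elements, shift

print(index_elements([], 0))        # -> ([], 0)
print(index_elements(["A"], 0)[1])  # -> 1
```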
@@ -29,7 +29,7 @@

 from urllib.parse import urlparse, urljoin

-from flask import request, url_for, redirect
+from flask import request, url_for, redirect, current_app


 def is_safe_url(target):

@@ -38,16 +38,15 @@ def is_safe_url(target):
     return test_url.scheme in ('http', 'https') and ref_url.netloc == test_url.netloc


-def get_redirect_target():
-    for target in request.values.get('next'), request.referrer:
-        if not target:
-            continue
-        if is_safe_url(target):
-            return target
+def remove_prefix(text, prefix):
+    if text.startswith(prefix):
+        return text[len(prefix):]
+    return ""


-def redirect_back(endpoint, **values):
-    target = request.form['next']
-    if not target or not is_safe_url(target):
+def get_redirect_location(next, endpoint, **values):
+    target = next or url_for(endpoint, **values)
+    adapter = current_app.url_map.bind(urlparse(request.host_url).netloc)
+    if not len(adapter.allowed_methods(remove_prefix(target, request.environ.get('HTTP_X_SCRIPT_NAME',"")))):
         target = url_for(endpoint, **values)
-    return redirect(target)
+    return target
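The new `remove_prefix` helper strips a reverse-proxy script prefix (`HTTP_X_SCRIPT_NAME`) so the redirect target can be matched against the app's URL map. Its logic, lifted directly from the diff, with illustrative paths:

```python
def remove_prefix(text, prefix):
    # Same logic as the new helper in the diff: strip the prefix when
    # present; otherwise return "" (note: not the original text).
    if text.startswith(prefix):
        return text[len(prefix):]
    return ""

print(remove_prefix("/calibre/login", "/calibre"))  # -> "/login"
print(remove_prefix("/login", "/calibre"))          # -> ""
```

Returning `""` for a non-matching target makes `adapter.allowed_methods("")` come back empty, so `get_redirect_location` falls back to `url_for(endpoint, **values)` instead of following an unroutable URL.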
@@ -58,8 +58,8 @@ def remote_login():
     ub.session.add(auth_token)
     ub.session_commit()
     verify_url = url_for('remotelogin.verify_token', token=auth_token.auth_token, _external=true)
-    log.debug(u"Remot Login request with token: %s", auth_token.auth_token)
-    return render_title_template('remote_login.html', title=_(u"Login"), token=auth_token.auth_token,
+    log.debug("Remot Login request with token: %s", auth_token.auth_token)
+    return render_title_template('remote_login.html', title=_("Login"), token=auth_token.auth_token,
                                  verify_url=verify_url, page="remotelogin")


@@ -71,8 +71,8 @@ def verify_token(token):

     # Token not found
     if auth_token is None:
-        flash(_(u"Token not found"), category="error")
-        log.error(u"Remote Login token not found")
+        flash(_("Token not found"), category="error")
+        log.error("Remote Login token not found")
         return redirect(url_for('web.index'))

     # Token expired

@@ -80,8 +80,8 @@ def verify_token(token):
         ub.session.delete(auth_token)
         ub.session_commit()

-        flash(_(u"Token has expired"), category="error")
-        log.error(u"Remote Login token expired")
+        flash(_("Token has expired"), category="error")
+        log.error("Remote Login token expired")
         return redirect(url_for('web.index'))

     # Update token with user information

@@ -89,8 +89,8 @@ def verify_token(token):
     auth_token.verified = True
     ub.session_commit()

-    flash(_(u"Success! Please return to your device"), category="success")
-    log.debug(u"Remote Login token for userid %s verified", auth_token.user_id)
+    flash(_("Success! Please return to your device"), category="success")
+    log.debug("Remote Login token for userid %s verified", auth_token.user_id)
     return redirect(url_for('web.index'))


@@ -105,7 +105,7 @@ def token_verified():
     # Token not found
     if auth_token is None:
         data['status'] = 'error'
-        data['message'] = _(u"Token not found")
+        data['message'] = _("Token not found")

     # Token expired
     elif datetime.now() > auth_token.expiration:

@@ -113,7 +113,7 @@ def token_verified():
         ub.session_commit()

         data['status'] = 'error'
-        data['message'] = _(u"Token has expired")
+        data['message'] = _("Token has expired")

     elif not auth_token.verified:
         data['status'] = 'not_verified'

@@ -126,8 +126,8 @@ def token_verified():
         ub.session_commit("User {} logged in via remotelogin, token deleted".format(user.name))

         data['status'] = 'success'
-        log.debug(u"Remote Login for userid %s succeeded", user.id)
-        flash(_(u"you are now logged in as: '%(nickname)s'", nickname=user.name), category="success")
+        log.debug("Remote Login for userid %s succeeded", user.id)
+        flash(_("Success! You are now logged in as: %(nickname)s", nickname=user.name), category="success")

     response = make_response(json.dumps(data, ensure_ascii=False))
     response.headers["Content-Type"] = "application/json; charset=utf-8"
@@ -20,11 +20,13 @@ from flask import render_template, g, abort, request
 from flask_babel import gettext as _
 from werkzeug.local import LocalProxy
 from flask_login import current_user
+from sqlalchemy.sql.expression import or_

-from . import config, constants, logger
+from . import config, constants, logger, ub
 from .ub import User



 log = logger.create()

 def get_sidebar_config(kwargs=None):

@@ -45,12 +47,12 @@ def get_sidebar_config(kwargs=None):
                     "show_text": _('Show Hot Books'), "config_show": True})
     if current_user.role_admin():
         sidebar.append({"glyph": "glyphicon-download", "text": _('Downloaded Books'), "link": 'web.download_list',
-                        "id": "download", "visibility": constants.SIDEBAR_DOWNLOAD, 'public': (not g.user.is_anonymous),
+                        "id": "download", "visibility": constants.SIDEBAR_DOWNLOAD, 'public': (not current_user.is_anonymous),
                         "page": "download", "show_text": _('Show Downloaded Books'),
                         "config_show": content})
     else:
         sidebar.append({"glyph": "glyphicon-download", "text": _('Downloaded Books'), "link": 'web.books_list',
-                        "id": "download", "visibility": constants.SIDEBAR_DOWNLOAD, 'public': (not g.user.is_anonymous),
+                        "id": "download", "visibility": constants.SIDEBAR_DOWNLOAD, 'public': (not current_user.is_anonymous),
                         "page": "download", "show_text": _('Show Downloaded Books'),
                         "config_show": content})
     sidebar.append(

@@ -58,47 +60,50 @@ def get_sidebar_config(kwargs=None):
         "visibility": constants.SIDEBAR_BEST_RATED, 'public': True, "page": "rated",
         "show_text": _('Show Top Rated Books'), "config_show": True})
     sidebar.append({"glyph": "glyphicon-eye-open", "text": _('Read Books'), "link": 'web.books_list', "id": "read",
-                    "visibility": constants.SIDEBAR_READ_AND_UNREAD, 'public': (not g.user.is_anonymous),
-                    "page": "read", "show_text": _('Show read and unread'), "config_show": content})
+                    "visibility": constants.SIDEBAR_READ_AND_UNREAD, 'public': (not current_user.is_anonymous),
+                    "page": "read", "show_text": _('Show Read and Unread'), "config_show": content})
     sidebar.append(
         {"glyph": "glyphicon-eye-close", "text": _('Unread Books'), "link": 'web.books_list', "id": "unread",
-         "visibility": constants.SIDEBAR_READ_AND_UNREAD, 'public': (not g.user.is_anonymous), "page": "unread",
+         "visibility": constants.SIDEBAR_READ_AND_UNREAD, 'public': (not current_user.is_anonymous), "page": "unread",
          "show_text": _('Show unread'), "config_show": False})
     sidebar.append({"glyph": "glyphicon-random", "text": _('Discover'), "link": 'web.books_list', "id": "rand",
                     "visibility": constants.SIDEBAR_RANDOM, 'public': True, "page": "discover",
                     "show_text": _('Show Random Books'), "config_show": True})
     sidebar.append({"glyph": "glyphicon-inbox", "text": _('Categories'), "link": 'web.category_list', "id": "cat",
                     "visibility": constants.SIDEBAR_CATEGORY, 'public': True, "page": "category",
-                    "show_text": _('Show category selection'), "config_show": True})
+                    "show_text": _('Show Category Section'), "config_show": True})
     sidebar.append({"glyph": "glyphicon-bookmark", "text": _('Series'), "link": 'web.series_list', "id": "serie",
                     "visibility": constants.SIDEBAR_SERIES, 'public': True, "page": "series",
-                    "show_text": _('Show series selection'), "config_show": True})
+                    "show_text": _('Show Series Section'), "config_show": True})
     sidebar.append({"glyph": "glyphicon-user", "text": _('Authors'), "link": 'web.author_list', "id": "author",
                     "visibility": constants.SIDEBAR_AUTHOR, 'public': True, "page": "author",
-                    "show_text": _('Show author selection'), "config_show": True})
+                    "show_text": _('Show Author Section'), "config_show": True})
     sidebar.append(
         {"glyph": "glyphicon-text-size", "text": _('Publishers'), "link": 'web.publisher_list', "id": "publisher",
          "visibility": constants.SIDEBAR_PUBLISHER, 'public': True, "page": "publisher",
-         "show_text": _('Show publisher selection'), "config_show":True})
+         "show_text": _('Show Publisher Section'), "config_show":True})
     sidebar.append({"glyph": "glyphicon-flag", "text": _('Languages'), "link": 'web.language_overview', "id": "lang",
-                    "visibility": constants.SIDEBAR_LANGUAGE, 'public': (g.user.filter_language() == 'all'),
+                    "visibility": constants.SIDEBAR_LANGUAGE, 'public': (current_user.filter_language() == 'all'),
                     "page": "language",
-                    "show_text": _('Show language selection'), "config_show": True})
+                    "show_text": _('Show Language Section'), "config_show": True})
     sidebar.append({"glyph": "glyphicon-star-empty", "text": _('Ratings'), "link": 'web.ratings_list', "id": "rate",
                     "visibility": constants.SIDEBAR_RATING, 'public': True,
-                    "page": "rating", "show_text": _('Show ratings selection'), "config_show": True})
+                    "page": "rating", "show_text": _('Show Ratings Section'), "config_show": True})
     sidebar.append({"glyph": "glyphicon-file", "text": _('File formats'), "link": 'web.formats_list', "id": "format",
                     "visibility": constants.SIDEBAR_FORMAT, 'public': True,
-                    "page": "format", "show_text": _('Show file formats selection'), "config_show": True})
+                    "page": "format", "show_text": _('Show File Formats Section'), "config_show": True})
     sidebar.append(
         {"glyph": "glyphicon-trash", "text": _('Archived Books'), "link": 'web.books_list', "id": "archived",
-         "visibility": constants.SIDEBAR_ARCHIVED, 'public': (not g.user.is_anonymous), "page": "archived",
-         "show_text": _('Show archived books'), "config_show": content})
+         "visibility": constants.SIDEBAR_ARCHIVED, 'public': (not current_user.is_anonymous), "page": "archived",
+         "show_text": _('Show Archived Books'), "config_show": content})
     if not simple:
         sidebar.append(
             {"glyph": "glyphicon-th-list", "text": _('Books List'), "link": 'web.books_table', "id": "list",
-             "visibility": constants.SIDEBAR_LIST, 'public': (not g.user.is_anonymous), "page": "list",
+             "visibility": constants.SIDEBAR_LIST, 'public': (not current_user.is_anonymous), "page": "list",
              "show_text": _('Show Books List'), "config_show": content})
+    g.shelves_access = ub.session.query(ub.Shelf).filter(
+        or_(ub.Shelf.is_public == 1, ub.Shelf.user_id == current_user.id)).order_by(ub.Shelf.name).all()
+
     return sidebar, simple

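The `or_(ub.Shelf.is_public == 1, ub.Shelf.user_id == current_user.id)` filter now populating `g.shelves_access` encodes one rule: a shelf is listed when it is public or owned by the current user, ordered by name. A plain-Python sketch of that rule (the dict rows are hypothetical stand-ins for `ub.Shelf` objects):

```python
# Plain-Python sketch of the shelf visibility rule behind g.shelves_access:
# keep a shelf if it is public or belongs to the user, sorted by name.
def visible_shelves(shelves, user_id):
    return sorted(
        (s for s in shelves if s["is_public"] == 1 or s["user_id"] == user_id),
        key=lambda s: s["name"])

shelves = [
    {"name": "b-private", "is_public": 0, "user_id": 2},
    {"name": "a-public", "is_public": 1, "user_id": 1},
]
print([s["name"] for s in visible_shelves(shelves, 3)])  # -> ['a-public']
```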
@@ -19,19 +19,26 @@
 import datetime

 from . import config, constants
-from .services.background_scheduler import BackgroundScheduler, use_APScheduler
+from .services.background_scheduler import BackgroundScheduler, CronTrigger, use_APScheduler
 from .tasks.database import TaskReconnectDatabase
+from .tasks.tempFolder import TaskDeleteTempFolder
 from .tasks.thumbnail import TaskGenerateCoverThumbnails, TaskGenerateSeriesThumbnails, TaskClearCoverThumbnailCache
 from .services.worker import WorkerThread
+from .tasks.metadata_backup import TaskBackupMetadata


 def get_scheduled_tasks(reconnect=True):
     tasks = list()
-    # config.schedule_reconnect or
-    # Reconnect Calibre database (metadata.db)
+    # Reconnect Calibre database (metadata.db) based on config.schedule_reconnect
     if reconnect:
         tasks.append([lambda: TaskReconnectDatabase(), 'reconnect', False])

+    # Delete temp folder
+    tasks.append([lambda: TaskDeleteTempFolder(), 'delete temp', True])
+
+    # Generate metadata.opf file for each changed book
+    if config.schedule_metadata_backup:
+        tasks.append([lambda: TaskBackupMetadata("en"), 'backup metadata', False])
+
     # Generate all missing book cover thumbnails
     if config.schedule_generate_book_covers:
         tasks.append([lambda: TaskClearCoverThumbnailCache(0), 'delete superfluous book covers', True])

@@ -62,10 +69,13 @@ def register_scheduled_tasks(reconnect=True):
     duration = config.schedule_duration

     # Register scheduled tasks
-    scheduler.schedule_tasks(tasks=get_scheduled_tasks(reconnect), trigger='cron', hour=start)
+    timezone_info = datetime.datetime.now(datetime.timezone.utc).astimezone().tzinfo
+    scheduler.schedule_tasks(tasks=get_scheduled_tasks(reconnect), trigger=CronTrigger(hour=start,
+                                                                                      timezone=timezone_info))
     end_time = calclulate_end_time(start, duration)
-    scheduler.schedule(func=end_scheduled_tasks, trigger='cron', name="end scheduled task", hour=end_time.hour,
-                       minute=end_time.minute)
+    scheduler.schedule(func=end_scheduled_tasks, trigger=CronTrigger(hour=end_time.hour, minute=end_time.minute,
+                                                                    timezone=timezone_info),
+                       name="end scheduled task")

     # Kick-off tasks, if they should currently be running
     if should_task_be_running(start, duration):

@@ -83,6 +93,8 @@ def register_startup_tasks():
     # Ignore tasks that should currently be running, as these will be added when registering scheduled tasks
     if constants.APP_MODE in ['development', 'test'] and not should_task_be_running(start, duration):
         scheduler.schedule_tasks_immediately(tasks=get_scheduled_tasks(False))
+    else:
+        scheduler.schedule_tasks_immediately(tasks=[[lambda: TaskDeleteTempFolder(), 'delete temp', True]])


 def should_task_be_running(start, duration):

@@ -91,6 +103,7 @@ def should_task_be_running(start, duration):
     end_time = start_time + datetime.timedelta(hours=duration // 60, minutes=duration % 60)
     return start_time < now < end_time


 def calclulate_end_time(start, duration):
     start_time = datetime.datetime.now().replace(hour=start, minute=0)
     return start_time + datetime.timedelta(hours=duration // 60, minutes=duration % 60)
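The scheduling change passes an explicit timezone to `CronTrigger` instead of the bare `trigger='cron'` string. The timezone is derived with the stdlib expression shown in the diff: take an aware UTC "now", convert it to local time, and keep the resulting `tzinfo`. That derivation in isolation:

```python
import datetime

# How the patch derives the local timezone object handed to CronTrigger:
# an aware UTC "now", converted to local time, keeping only the tzinfo.
timezone_info = datetime.datetime.now(datetime.timezone.utc).astimezone().tzinfo

# The result is a fixed-offset datetime.timezone, so its UTC offset
# does not depend on the datetime argument.
print(timezone_info.utcoffset(None))
```

The same `timezone_info` is reused for both the start trigger (`hour=start`) and the end trigger (`hour=end_time.hour, minute=end_time.minute`), so both fire in the server's local time rather than APScheduler's default.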
@ -45,7 +45,7 @@ def simple_search():
|
||||||
return render_title_template('search.html',
|
return render_title_template('search.html',
|
||||||
searchterm="",
|
searchterm="",
|
||||||
result_count=0,
|
result_count=0,
|
||||||
title=_(u"Search"),
|
title=_("Search"),
|
||||||
page="search")
|
page="search")
|
||||||
|
|
||||||
|
|
||||||
|
```diff
@@ -185,18 +185,18 @@ def extend_search_term(searchterm,
     searchterm.extend((author_name.replace('|', ','), book_title, publisher))
     if pub_start:
         try:
-            searchterm.extend([_(u"Published after ") +
+            searchterm.extend([_("Published after ") +
                                format_date(datetime.strptime(pub_start, "%Y-%m-%d"),
                                            format='medium')])
         except ValueError:
-            pub_start = u""
+            pub_start = ""
     if pub_end:
         try:
-            searchterm.extend([_(u"Published before ") +
+            searchterm.extend([_("Published before ") +
                                format_date(datetime.strptime(pub_end, "%Y-%m-%d"),
                                            format='medium')])
         except ValueError:
-            pub_end = u""
+            pub_end = ""
     elements = {'tag': db.Tags, 'serie':db.Series, 'shelf':ub.Shelf}
     for key, db_element in elements.items():
         tag_names = calibre_db.session.query(db_element).filter(db_element.id.in_(tags['include_' + key])).all()

@@ -214,11 +214,11 @@ def extend_search_term(searchterm,
     language_names = calibre_db.speaking_language(language_names)
     searchterm.extend(language.name for language in language_names)
     if rating_high:
-        searchterm.extend([_(u"Rating <= %(rating)s", rating=rating_high)])
+        searchterm.extend([_("Rating <= %(rating)s", rating=rating_high)])
     if rating_low:
-        searchterm.extend([_(u"Rating >= %(rating)s", rating=rating_low)])
-    if read_status:
-        searchterm.extend([_(u"Read Status = %(status)s", status=read_status)])
+        searchterm.extend([_("Rating >= %(rating)s", rating=rating_low)])
+    if read_status != "Any":
+        searchterm.extend([_("Read Status = '%(status)s'", status=read_status)])
     searchterm.extend(ext for ext in tags['include_extension'])
     searchterm.extend(ext for ext in tags['exclude_extension'])
     # handle custom columns
```
```diff
@@ -267,23 +267,23 @@ def render_adv_search_results(term, offset=None, order=None, limit=None):
             column_start = term.get('custom_column_' + str(c.id) + '_start')
             column_end = term.get('custom_column_' + str(c.id) + '_end')
             if column_start:
-                search_term.extend([u"{} >= {}".format(c.name,
+                search_term.extend(["{} >= {}".format(c.name,
                                                        format_date(datetime.strptime(column_start, "%Y-%m-%d").date(),
                                                                    format='medium')
                                                        )])
                 cc_present = True
             if column_end:
-                search_term.extend([u"{} <= {}".format(c.name,
+                search_term.extend(["{} <= {}".format(c.name,
                                                        format_date(datetime.strptime(column_end, "%Y-%m-%d").date(),
                                                                    format='medium')
                                                        )])
                 cc_present = True
             elif term.get('custom_column_' + str(c.id)):
-                search_term.extend([(u"{}: {}".format(c.name, term.get('custom_column_' + str(c.id))))])
+                search_term.extend([("{}: {}".format(c.name, term.get('custom_column_' + str(c.id))))])
                 cc_present = True

     if any(tags.values()) or author_name or book_title or publisher or pub_start or pub_end or rating_low \
-            or rating_high or description or cc_present or read_status:
+            or rating_high or description or cc_present or read_status != "Any":
         search_term, pub_start, pub_end = extend_search_term(search_term,
                                                              author_name,
                                                              book_title,
```
```diff
@@ -302,7 +302,8 @@ def render_adv_search_results(term, offset=None, order=None, limit=None):
         q = q.filter(func.datetime(db.Books.pubdate) > func.datetime(pub_start))
     if pub_end:
         q = q.filter(func.datetime(db.Books.pubdate) < func.datetime(pub_end))
-    q = q.filter(adv_search_read_status(read_status))
+    if read_status != "Any":
+        q = q.filter(adv_search_read_status(read_status))
     if publisher:
         q = q.filter(db.Books.publishers.any(func.lower(db.Publishers.name).ilike("%" + publisher + "%")))
     q = adv_search_tag(q, tags['include_tag'], tags['exclude_tag'])
```
```diff
@@ -339,7 +340,7 @@ def render_adv_search_results(term, offset=None, order=None, limit=None):
                                  pagination=pagination,
                                  entries=entries,
                                  result_count=result_count,
-                                 title=_(u"Advanced Search"), page="advsearch",
+                                 title=_("Advanced Search"), page="advsearch",
                                  order=order[1])
```
```diff
@@ -366,22 +367,28 @@ def render_prepare_search_form(cc):
         .filter(calibre_db.common_filters()) \
         .group_by(db.Data.format)\
         .order_by(db.Data.format).all()
-    if current_user.filter_language() == u"all":
+    if current_user.filter_language() == "all":
         languages = calibre_db.speaking_language()
     else:
         languages = None
     return render_title_template('search_form.html', tags=tags, languages=languages, extensions=extensions,
-                                 series=series,shelves=shelves, title=_(u"Advanced Search"), cc=cc, page="advsearch")
+                                 series=series,shelves=shelves, title=_("Advanced Search"), cc=cc, page="advsearch")


 def render_search_results(term, offset=None, order=None, limit=None):
-    join = db.books_series_link, db.Books.id == db.books_series_link.c.book, db.Series
-    entries, result_count, pagination = calibre_db.get_search_results(term,
-                                                                      config,
-                                                                      offset,
-                                                                      order,
-                                                                      limit,
-                                                                      *join)
+    if term:
+        join = db.books_series_link, db.Books.id == db.books_series_link.c.book, db.Series
+        entries, result_count, pagination = calibre_db.get_search_results(term,
+                                                                          config,
+                                                                          offset,
+                                                                          order,
+                                                                          limit,
+                                                                          *join)
+    else:
+        entries = list()
+        order = [None, None]
+        pagination = result_count = None
+
     return render_title_template('search.html',
                                  searchterm=term,
                                  pagination=pagination,

@@ -389,7 +396,7 @@ def render_search_results(term, offset=None, order=None, limit=None):
                                  adv_searchterm=term,
                                  entries=entries,
                                  result_count=result_count,
-                                 title=_(u"Search"),
+                                 title=_("Search"),
                                  page="search",
                                  order=order[1])
```
cps/server.py (112 changes)
```diff
@@ -21,12 +21,13 @@ import os
 import errno
 import signal
 import socket
-import subprocess  # nosec
+import asyncio

 try:
     from gevent.pywsgi import WSGIServer
     from .gevent_wsgi import MyWSGIHandler
     from gevent.pool import Pool
+    from gevent.socket import socket as GeventSocket
     from gevent import __version__ as _version
     from greenlet import GreenletExit
     import ssl

@@ -36,6 +37,7 @@ except ImportError:
     from .tornado_wsgi import MyWSGIContainer
     from tornado.httpserver import HTTPServer
     from tornado.ioloop import IOLoop
+    from tornado import netutil
     from tornado import version as _version
     VERSION = 'Tornado ' + _version
     _GEVENT = False
```
```diff
@@ -95,7 +97,12 @@ class WebServer(object):
             log.warning('Cert path: %s', certfile_path)
             log.warning('Key path: %s', keyfile_path)

-    def _make_gevent_unix_socket(self, socket_file):
+    def _make_gevent_socket_activated(self):
+        # Reuse an already open socket on fd=SD_LISTEN_FDS_START
+        SD_LISTEN_FDS_START = 3
+        return GeventSocket(fileno=SD_LISTEN_FDS_START)
+
+    def _prepare_unix_socket(self, socket_file):
         # the socket file must not exist prior to bind()
         if os.path.exists(socket_file):
             # avoid nuking regular files and symbolic links (could be a mistype or security issue)

@@ -103,35 +110,41 @@ class WebServer(object):
                 raise OSError(errno.EEXIST, os.strerror(errno.EEXIST), socket_file)
             os.remove(socket_file)

-        unix_sock = WSGIServer.get_listener(socket_file, family=socket.AF_UNIX)
         self.unix_socket_file = socket_file

-        # ensure current user and group have r/w permissions, no permissions for other users
-        # this way the socket can be shared in a semi-secure manner
-        # between the user running calibre-web and the user running the fronting webserver
-        os.chmod(socket_file, 0o660)
-
-        return unix_sock
-
-    def _make_gevent_socket(self):
+    def _make_gevent_listener(self):
         if os.name != 'nt':
+            socket_activated = os.environ.get("LISTEN_FDS")
+            if socket_activated:
+                sock = self._make_gevent_socket_activated()
+                sock_info = sock.getsockname()
+                return sock, "systemd-socket:" + _readable_listen_address(sock_info[0], sock_info[1])
             unix_socket_file = os.environ.get("CALIBRE_UNIX_SOCKET")
             if unix_socket_file:
-                return self._make_gevent_unix_socket(unix_socket_file), "unix:" + unix_socket_file
+                self._prepare_unix_socket(unix_socket_file)
+                unix_sock = WSGIServer.get_listener(unix_socket_file, family=socket.AF_UNIX)
+                # ensure current user and group have r/w permissions, no permissions for other users
+                # this way the socket can be shared in a semi-secure manner
+                # between the user running calibre-web and the user running the fronting webserver
+                os.chmod(unix_socket_file, 0o660)
+
+                return unix_sock, "unix:" + unix_socket_file

         if self.listen_address:
-            return (self.listen_address, self.listen_port), None
+            return ((self.listen_address, self.listen_port),
+                    _readable_listen_address(self.listen_address, self.listen_port))

         if os.name == 'nt':
             self.listen_address = '0.0.0.0'
-            return (self.listen_address, self.listen_port), None
+            return ((self.listen_address, self.listen_port),
+                    _readable_listen_address(self.listen_address, self.listen_port))

         try:
             address = ('::', self.listen_port)
             sock = WSGIServer.get_listener(address, family=socket.AF_INET6)
         except socket.error as ex:
             log.error('%s', ex)
-            log.warning('Unable to listen on "", trying on IPv4 only...')
+            log.warning('Unable to listen on {}, trying on IPv4 only...'.format(address))
             address = ('', self.listen_port)
             sock = WSGIServer.get_listener(address, family=socket.AF_INET)
```
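The new `LISTEN_FDS` branch follows the systemd socket-activation protocol: systemd binds the socket itself and hands it to the service starting at file descriptor 3, so the server wraps that descriptor instead of calling `bind()`. A minimal sketch with the stdlib `socket` module; since this example does not run under systemd, a duplicated descriptor of a locally bound socket stands in for the fd systemd would pass:

```python
import os
import socket

SD_LISTEN_FDS_START = 3  # first fd systemd passes to a socket-activated service


def adopt_listener():
    # Under systemd socket activation, LISTEN_FDS is set and the first
    # pre-bound socket sits at fd 3; wrap it instead of binding a new one.
    if os.environ.get("LISTEN_FDS"):
        return socket.socket(fileno=SD_LISTEN_FDS_START)
    return None


# Outside systemd, simulate the handover: bind a listener ourselves, then
# re-adopt its duplicated descriptor exactly the way the server would.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))
listener.listen(1)
adopted = socket.socket(fileno=os.dup(listener.fileno()))
```

The adopted object refers to the same underlying socket, so it reports the same bound address as the original.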
```diff
@@ -152,7 +165,7 @@ class WebServer(object):
         # The value of __package__ indicates how Python was called. It may
         # not exist if a setuptools script is installed as an egg. It may be
         # set incorrectly for entry points created with pip on Windows.
-        if getattr(__main__, "__package__", None) is None or (
+        if getattr(__main__, "__package__", "") in ["", None] or (
             os.name == "nt"
             and __main__.__package__ == ""
             and not os.path.exists(py_script)

@@ -193,15 +206,15 @@ class WebServer(object):
             rv.extend(("-m", py_module.lstrip(".")))

         rv.extend(args)
+        if os.name == 'nt':
+            rv = ['"{}"'.format(a) for a in rv]
         return rv

     def _start_gevent(self):
         ssl_args = self.ssl_args or {}

         try:
-            sock, output = self._make_gevent_socket()
-            if output is None:
-                output = _readable_listen_address(self.listen_address, self.listen_port)
+            sock, output = self._make_gevent_listener()
             log.info('Starting Gevent server on %s', output)
             self.wsgiserver = WSGIServer(sock, self.app, log=self.access_logger, handler_class=MyWSGIHandler,
                                          error_log=log,
```
```diff
@@ -226,17 +239,42 @@ class WebServer(object):
         if os.name == 'nt' and sys.version_info > (3, 7):
             import asyncio
             asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
-        log.info('Starting Tornado server on %s', _readable_listen_address(self.listen_address, self.listen_port))
-
-        # Max Buffersize set to 200MB
-        http_server = HTTPServer(MyWSGIContainer(self.app),
-                                 max_buffer_size=209700000,
-                                 ssl_options=self.ssl_args)
-        http_server.listen(self.listen_port, self.listen_address)
-        self.wsgiserver = IOLoop.current()
-        self.wsgiserver.start()
-        # wait for stop signal
-        self.wsgiserver.close(True)
+        try:
+            # Max Buffersize set to 200MB
+            http_server = HTTPServer(MyWSGIContainer(self.app),
+                                     max_buffer_size=209700000,
+                                     ssl_options=self.ssl_args)
+
+            unix_socket_file = os.environ.get("CALIBRE_UNIX_SOCKET")
+            if os.environ.get("LISTEN_FDS") and os.name != 'nt':
+                SD_LISTEN_FDS_START = 3
+                sock = socket.socket(fileno=SD_LISTEN_FDS_START)
+                http_server.add_socket(sock)
+                sock.setblocking(0)
+                socket_name = sock.getsockname()
+                output = "systemd-socket:" + _readable_listen_address(socket_name[0], socket_name[1])
+            elif unix_socket_file and os.name != 'nt':
+                self._prepare_unix_socket(unix_socket_file)
+                output = "unix:" + unix_socket_file
+                unix_socket = netutil.bind_unix_socket(self.unix_socket_file)
+                http_server.add_socket(unix_socket)
+                # ensure current user and group have r/w permissions, no permissions for other users
+                # this way the socket can be shared in a semi-secure manner
+                # between the user running calibre-web and the user running the fronting webserver
+                os.chmod(self.unix_socket_file, 0o660)
+            else:
+                output = _readable_listen_address(self.listen_address, self.listen_port)
+                http_server.listen(self.listen_port, self.listen_address)
+            log.info('Starting Tornado server on %s', output)
+
+            self.wsgiserver = IOLoop.current()
+            self.wsgiserver.start()
+            # wait for stop signal
+            self.wsgiserver.close(True)
+        finally:
+            if self.unix_socket_file:
+                os.remove(self.unix_socket_file)
+                self.unix_socket_file = None
```
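Both backends now share the same Unix-socket preparation: refuse to delete anything at the path that is not a stale socket, bind, then `chmod` to `0o660` so only the owning user and group (for example a fronting webserver running in the same group) can connect. A standalone sketch of that flow under a temporary directory; `bind_shared_unix_socket` is a hypothetical helper name, not the project's API:

```python
import errno
import os
import socket
import stat
import tempfile


def bind_shared_unix_socket(socket_file):
    # the socket file must not exist prior to bind(); only ever remove a stale
    # socket, never a regular file or symlink left at that path by mistake
    if os.path.lexists(socket_file):
        if not stat.S_ISSOCK(os.lstat(socket_file).st_mode):
            raise OSError(errno.EEXIST, os.strerror(errno.EEXIST), socket_file)
        os.remove(socket_file)
    sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    sock.bind(socket_file)
    sock.listen(5)
    # r/w for user and group only, nothing for others, so the socket can be
    # shared in a semi-secure manner with a fronting webserver
    os.chmod(socket_file, 0o660)
    return sock


sock_path = os.path.join(tempfile.mkdtemp(), "calibre-web.sock")
server_sock = bind_shared_unix_socket(sock_path)
```

After binding, the path exists as a socket node with exactly the `0o660` permission bits.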
```diff
     def start(self):
         try:
@@ -262,9 +300,16 @@ class WebServer(object):

         log.info("Performing restart of Calibre-Web")
         args = self._get_args_for_reloading()
-        subprocess.call(args, close_fds=True)  # nosec
+        os.execv(args[0].lstrip('"').rstrip('"'), args)
         return True

+    @staticmethod
+    def shutdown_scheduler():
+        from .services.background_scheduler import BackgroundScheduler
+        scheduler = BackgroundScheduler()
+        if scheduler:
+            scheduler.scheduler.shutdown()
+
     def _killServer(self, __, ___):
         self.stop()

@@ -273,9 +318,14 @@ class WebServer(object):
         updater_thread.stop()

         log.info("webserver stop (restart=%s)", restart)
+        self.shutdown_scheduler()
         self.restart = restart
         if self.wsgiserver:
             if _GEVENT:
                 self.wsgiserver.close()
             else:
-                self.wsgiserver.add_callback_from_signal(self.wsgiserver.stop)
+                if restart:
+                    self.wsgiserver.call_later(1.0, self.wsgiserver.stop)
+                else:
+                    self.wsgiserver.asyncio_loop.call_soon_threadsafe(self.wsgiserver.stop)
```
```diff
@@ -19,11 +19,9 @@

 import sys
 from base64 import b64decode, b64encode
-from jsonschema import validate, exceptions, __version__
+from jsonschema import validate, exceptions
 from datetime import datetime
-
-from urllib.parse import unquote

 from flask import json
 from .. import logger

@@ -32,10 +30,10 @@ log = logger.create()


 def b64encode_json(json_data):
-    return b64encode(json.dumps(json_data).encode())
+    return b64encode(json.dumps(json_data).encode()).decode("utf-8")


-# Python3 has a timestamp() method we could be calling, however it's not avaiable in python2.
+# Python3 has a timestamp() method we could be calling, however it's not available in python2.
 def to_epoch_timestamp(datetime_object):
     return (datetime_object - datetime(1970, 1, 1)).total_seconds()
```
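Two of the sync-token helpers touched above are easy to verify in isolation: `b64encode` returns `bytes`, so the added `.decode("utf-8")` is what makes the token usable as an HTTP header string, and the epoch helper subtracts the Unix epoch instead of calling `datetime.timestamp()`. A sketch using the stdlib `json` in place of `flask.json` (they serialize plain dicts the same way for this purpose):

```python
import json
from base64 import b64decode, b64encode
from datetime import datetime


def b64encode_json(json_data):
    # b64encode yields bytes; decode so the token can travel in a response header
    return b64encode(json.dumps(json_data).encode()).decode("utf-8")


def to_epoch_timestamp(datetime_object):
    # seconds since the Unix epoch, computed without datetime.timestamp()
    return (datetime_object - datetime(1970, 1, 1)).total_seconds()
```

Round-tripping through `b64decode` recovers the original payload, and the epoch helper agrees with the well-known value for 2000-01-01.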
```diff
@@ -49,7 +47,7 @@ def get_datetime_from_json(json_object, field_name):


 class SyncToken:
-    """ The SyncToken is used to persist state accross requests.
+    """ The SyncToken is used to persist state across requests.
     When serialized over the response headers, the Kobo device will propagate the token onto following
     requests to the service. As an example use-case, the SyncToken is used to detect books that have been added
     to the library since the last time the device synced to the server.
```
```diff
@@ -23,6 +23,8 @@ from .worker import WorkerThread

 try:
     from apscheduler.schedulers.background import BackgroundScheduler as BScheduler
+    from apscheduler.triggers.cron import CronTrigger
+    from apscheduler.triggers.date import DateTrigger
     use_APScheduler = True
 except (ImportError, RuntimeError) as e:
     use_APScheduler = False

@@ -43,35 +45,33 @@ class BackgroundScheduler:
             cls.scheduler = BScheduler()
             cls.scheduler.start()

-            atexit.register(lambda: cls.scheduler.shutdown())
-
         return cls._instance

-    def schedule(self, func, trigger, name=None, **trigger_args):
+    def schedule(self, func, trigger, name=None):
         if use_APScheduler:
-            return self.scheduler.add_job(func=func, trigger=trigger, name=name, **trigger_args)
+            return self.scheduler.add_job(func=func, trigger=trigger, name=name)

     # Expects a lambda expression for the task
-    def schedule_task(self, task, user=None, name=None, hidden=False, trigger='cron', **trigger_args):
+    def schedule_task(self, task, user=None, name=None, hidden=False, trigger=None):
         if use_APScheduler:
             def scheduled_task():
                 worker_task = task()
                 worker_task.scheduled = True
                 WorkerThread.add(user, worker_task, hidden=hidden)
-            return self.schedule(func=scheduled_task, trigger=trigger, name=name, **trigger_args)
+            return self.schedule(func=scheduled_task, trigger=trigger, name=name)

     # Expects a list of lambda expressions for the tasks
-    def schedule_tasks(self, tasks, user=None, trigger='cron', **trigger_args):
+    def schedule_tasks(self, tasks, user=None, trigger=None):
         if use_APScheduler:
             for task in tasks:
-                self.schedule_task(task[0], user=user, trigger=trigger, name=task[1], hidden=task[2], **trigger_args)
+                self.schedule_task(task[0], user=user, trigger=trigger, name=task[1], hidden=task[2])

     # Expects a lambda expression for the task
     def schedule_task_immediately(self, task, user=None, name=None, hidden=False):
         if use_APScheduler:
             def immediate_task():
                 WorkerThread.add(user, task(), hidden)
-            return self.schedule(func=immediate_task, trigger='date', name=name)
+            return self.schedule(func=immediate_task, trigger=DateTrigger(), name=name)

     # Expects a list of lambda expressions for the tasks
     def schedule_tasks_immediately(self, tasks, user=None):
```
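`BackgroundScheduler` is a process-wide singleton: `__new__` builds the APScheduler instance once, and every later construction returns the same object, which is why `shutdown_scheduler()` in server.py can reach the running scheduler by simply calling the constructor. The shape of that pattern, stripped of APScheduler (the `jobs` list here is a hypothetical stand-in for the real scheduler, used only for illustration):

```python
class SchedulerSingleton:
    _instance = None

    def __new__(cls):
        if cls._instance is None:
            # first construction: create and remember the shared instance
            cls._instance = super().__new__(cls)
            cls._instance.jobs = []
        return cls._instance

    def schedule(self, name):
        self.jobs.append(name)
```

Every call site gets the same object, so jobs registered through one reference are visible through all others.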
```diff
@@ -18,16 +18,49 @@

 import time
 from functools import reduce
+import requests
+
+from goodreads.client import GoodreadsClient
+from goodreads.request import GoodreadsRequest
+import xmltodict

 try:
-    from goodreads.client import GoodreadsClient
+    import Levenshtein
 except ImportError:
-    from betterreads.client import GoodreadsClient
-
-try: import Levenshtein
-except ImportError: Levenshtein = False
+    Levenshtein = False

 from .. import logger
+from ..clean_html import clean_string
+
+
+class my_GoodreadsClient(GoodreadsClient):
+
+    def request(self, *args, **kwargs):
+        """Create a GoodreadsRequest object and make that request"""
+        req = my_GoodreadsRequest(self, *args, **kwargs)
+        return req.request()
+
+
+class GoodreadsRequestException(Exception):
+    def __init__(self, error_msg, url):
+        self.error_msg = error_msg
+        self.url = url
+
+    def __str__(self):
+        return self.url, ':', self.error_msg
+
+
+class my_GoodreadsRequest(GoodreadsRequest):
+
+    def request(self):
+        resp = requests.get(self.host+self.path, params=self.params,
+                            headers={"User-Agent":"Mozilla/5.0 (X11; Linux x86_64; rv:125.0) "
+                                                  "Gecko/20100101 Firefox/125.0"})
+        if resp.status_code != 200:
+            raise GoodreadsRequestException(resp.reason, self.path)
+        if self.req_format == 'xml':
+            data_dict = xmltodict.parse(resp.content)
+            return data_dict['GoodreadsResponse']
+        else:
+            raise Exception("Invalid format")
+
+
 log = logger.create()
```
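The replacement `my_GoodreadsRequest` leans on `xmltodict` to turn the XML payload into nested dictionaries keyed by tag name. A rough stdlib equivalent with `xml.etree.ElementTree` shows the shape of what `xmltodict.parse(resp.content)['GoodreadsResponse']` yields for a simple payload; attributes and repeated sibling tags, which `xmltodict` also handles, are ignored in this sketch, and the sample XML is invented:

```python
import xml.etree.ElementTree as ET


def parse_goodreads_response(xml_text):
    # Collapse an element tree into nested dicts: leaf elements become their
    # text, inner elements become {tag: ...} mappings, mirroring xmltodict's
    # basic behaviour for attribute-free documents.
    def to_dict(node):
        if len(node) == 0:
            return node.text
        return {child.tag: to_dict(child) for child in node}

    root = ET.fromstring(xml_text)  # root is <GoodreadsResponse>
    return to_dict(root)


sample = "<GoodreadsResponse><author><name>Jane Doe</name></author></GoodreadsResponse>"
```

For the sample document this produces a nested dict with the author's name at `["author"]["name"]`.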
```diff
@@ -38,20 +71,20 @@ _CACHE_TIMEOUT = 23 * 60 * 60  # 23 hours (in seconds)
 _AUTHORS_CACHE = {}


-def connect(key=None, secret=None, enabled=True):
+def connect(key=None, enabled=True):
     global _client

-    if not enabled or not key or not secret:
+    if not enabled or not key:
         _client = None
         return

     if _client:
         # make sure the configuration has not changed since last we used the client
-        if _client.client_key != key or _client.client_secret != secret:
+        if _client.client_key != key:
             _client = None

     if not _client:
-        _client = GoodreadsClient(key, secret)
+        _client = my_GoodreadsClient(key, None)


 def get_author_info(author_name):

@@ -76,6 +109,7 @@ def get_author_info(author_name):

     if author_info:
         author_info._timestamp = now
+        author_info.safe_about = clean_string(author_info.about)
         _AUTHORS_CACHE[author_name] = author_info
     return author_info
```
```diff
@@ -20,6 +20,7 @@ import base64

 from flask_simpleldap import LDAP, LDAPException
 from flask_simpleldap import ldap as pyLDAP
+from flask import current_app
 from .. import constants, logger

 try:

@@ -28,8 +29,47 @@ except ImportError:
     pass

 log = logger.create()
-_ldap = LDAP()
+
+
+class LDAPLogger(object):
+
+    def write(self, message):
+        try:
+            log.debug(message.strip("\n").replace("\n", ""))
+        except Exception:
+            log.debug("Logging Error")
+
+
+class mySimpleLDap(LDAP):
+
+    @staticmethod
+    def init_app(app):
+        super(mySimpleLDap, mySimpleLDap).init_app(app)
+        app.config.setdefault('LDAP_LOGLEVEL', 0)
+
+    @property
+    def initialize(self):
+        """Initialize a connection to the LDAP server.
+
+        :return: LDAP connection object.
+        """
+        try:
+            log_level = 2 if current_app.config['LDAP_LOGLEVEL'] == logger.logging.DEBUG else 0
+            conn = pyLDAP.initialize('{0}://{1}:{2}'.format(
+                current_app.config['LDAP_SCHEMA'],
+                current_app.config['LDAP_HOST'],
+                current_app.config['LDAP_PORT']), trace_level=log_level, trace_file=LDAPLogger())
+            conn.set_option(pyLDAP.OPT_NETWORK_TIMEOUT,
+                            current_app.config['LDAP_TIMEOUT'])
+            conn = self._set_custom_options(conn)
+            conn.protocol_version = pyLDAP.VERSION3
+            if current_app.config['LDAP_USE_TLS']:
+                conn.start_tls_s()
+            return conn
+        except pyLDAP.LDAPError as e:
+            raise LDAPException(self.error(e.args))
+
+
+_ldap = mySimpleLDap()

 def init_app(app, config):
     if config.config_login_type != constants.LOGIN_LDAP:
```
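`trace_file=LDAPLogger()` works because python-ldap only requires a file-like object with a `write()` method for its trace output; the class folds the multi-line trace text into single log lines. The same duck-typed writer can be exercised without an LDAP server at all; the capture list below is purely for illustration:

```python
class TraceWriter:
    """File-like sink: anything exposing write() can serve as a trace_file."""

    def __init__(self, sink):
        self.sink = sink

    def write(self, message):
        # strip the surrounding newlines and fold embedded ones into one line,
        # the same normalization LDAPLogger applies before calling log.debug()
        self.sink.append(message.strip("\n").replace("\n", ""))


lines = []
writer = TraceWriter(lines)
writer.write("ldap_result ld 0x1\nres 0x2\n")
```

Each `write()` call lands as exactly one flattened entry in the sink.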
@ -44,15 +84,15 @@ def init_app(app, config):
|
||||||
app.config['LDAP_SCHEMA'] = 'ldap'
|
app.config['LDAP_SCHEMA'] = 'ldap'
|
||||||
if config.config_ldap_authentication > constants.LDAP_AUTH_ANONYMOUS:
|
if config.config_ldap_authentication > constants.LDAP_AUTH_ANONYMOUS:
|
||||||
if config.config_ldap_authentication > constants.LDAP_AUTH_UNAUTHENTICATE:
|
if config.config_ldap_authentication > constants.LDAP_AUTH_UNAUTHENTICATE:
|
||||||
if config.config_ldap_serv_password is None:
|
if config.config_ldap_serv_password_e is None:
|
||||||
config.config_ldap_serv_password = ''
|
config.config_ldap_serv_password_e = ''
|
||||||
app.config['LDAP_PASSWORD'] = base64.b64decode(config.config_ldap_serv_password)
|
app.config['LDAP_PASSWORD'] = config.config_ldap_serv_password_e
|
||||||
else:
|
else:
|
||||||
app.config['LDAP_PASSWORD'] = base64.b64decode("")
|
app.config['LDAP_PASSWORD'] = ""
|
||||||
app.config['LDAP_USERNAME'] = config.config_ldap_serv_username
|
app.config['LDAP_USERNAME'] = config.config_ldap_serv_username
|
||||||
else:
|
else:
|
||||||
app.config['LDAP_USERNAME'] = ""
|
app.config['LDAP_USERNAME'] = ""
|
||||||
app.config['LDAP_PASSWORD'] = base64.b64decode("")
|
app.config['LDAP_PASSWORD'] = ""
|
||||||
if bool(config.config_ldap_cert_path):
|
if bool(config.config_ldap_cert_path):
|
||||||
app.config['LDAP_CUSTOM_OPTIONS'].update({
|
app.config['LDAP_CUSTOM_OPTIONS'].update({
|
||||||
pyLDAP.OPT_X_TLS_REQUIRE_CERT: pyLDAP.OPT_X_TLS_DEMAND,
|
pyLDAP.OPT_X_TLS_REQUIRE_CERT: pyLDAP.OPT_X_TLS_DEMAND,
|
||||||
|
@ -70,7 +110,7 @@ def init_app(app, config):
|
||||||
app.config['LDAP_OPENLDAP'] = bool(config.config_ldap_openldap)
|
app.config['LDAP_OPENLDAP'] = bool(config.config_ldap_openldap)
|
||||||
app.config['LDAP_GROUP_OBJECT_FILTER'] = config.config_ldap_group_object_filter
|
app.config['LDAP_GROUP_OBJECT_FILTER'] = config.config_ldap_group_object_filter
|
||||||
app.config['LDAP_GROUP_MEMBERS_FIELD'] = config.config_ldap_group_members_field
|
app.config['LDAP_GROUP_MEMBERS_FIELD'] = config.config_ldap_group_members_field
|
||||||
|
app.config['LDAP_LOGLEVEL'] = config.config_log_level
|
||||||
try:
|
try:
|
||||||
_ldap.init_app(app)
|
_ldap.init_app(app)
|
||||||
except ValueError:
|
except ValueError:
|
||||||
|
|
|
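The hunk above replaces the base64-encoded `config_ldap_serv_password` with the `config_ldap_serv_password_e` field used directly. One visible consequence is that the empty-password default changes type, since decoding an empty base64 string yields bytes, not a string:

```python
import base64

# Old default: decoding the empty base64 string gives empty *bytes*.
old_default = base64.b64decode("")
# New default: the *_e config value is used as-is, an empty *string*.
new_default = ""

print(old_default)                               # b''
print(type(old_default) is type(new_default))    # False
```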
@@ -266,3 +266,6 @@ class CalibreTask:
     def _handleSuccess(self):
         self.stat = STAT_FINISH_SUCCESS
         self.progress = 1
+
+    def __str__(self):
+        return self.name
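The `__str__` added to `CalibreTask` makes tasks render as their name wherever they are formatted into a string. A stand-in class (hypothetical, not the real `CalibreTask`) showing the effect:

```python
class Task:
    """Minimal stand-in for CalibreTask with the new __str__."""
    def __init__(self, name):
        self.name = name

    def __str__(self):
        # Same body as the diff: a task prints as its name.
        return self.name

print(str(Task("Convert book")))           # Convert book
print("Task: {}".format(Task("Upload")))   # Task: Upload
```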
cps/shelf.py (83 changed lines)
@@ -46,13 +46,13 @@ def add_to_shelf(shelf_id, book_id):
     if shelf is None:
         log.error("Invalid shelf specified: %s", shelf_id)
         if not xhr:
-            flash(_(u"Invalid shelf specified"), category="error")
+            flash(_("Invalid shelf specified"), category="error")
             return redirect(url_for('web.index'))
         return "Invalid shelf specified", 400

     if not check_shelf_edit_permissions(shelf):
         if not xhr:
-            flash(_(u"Sorry you are not allowed to add a book to that shelf"), category="error")
+            flash(_("Sorry you are not allowed to add a book to that shelf"), category="error")
             return redirect(url_for('web.index'))
         return "Sorry you are not allowed to add a book to the that shelf", 403

@@ -61,7 +61,7 @@ def add_to_shelf(shelf_id, book_id):
     if book_in_shelf:
         log.error("Book %s is already part of %s", book_id, shelf)
         if not xhr:
-            flash(_(u"Book is already part of the shelf: %(shelfname)s", shelfname=shelf.name), category="error")
+            flash(_("Book is already part of the shelf: %(shelfname)s", shelfname=shelf.name), category="error")
             return redirect(url_for('web.index'))
         return "Book is already part of the shelf: %s" % shelf.name, 400

@@ -71,6 +71,14 @@ def add_to_shelf(shelf_id, book_id):
     else:
         maxOrder = maxOrder[0]

+    if not calibre_db.session.query(db.Books).filter(db.Books.id == book_id).one_or_none():
+        log.error("Invalid Book Id: %s. Could not be added to shelf %s", book_id, shelf.name)
+        if not xhr:
+            flash(_("%(book_id)s is a invalid Book Id. Could not be added to Shelf", book_id=book_id),
+                  category="error")
+            return redirect(url_for('web.index'))
+        return "%s is a invalid Book Id. Could not be added to Shelf" % book_id, 400
+
     shelf.books.append(ub.BookShelf(shelf=shelf.id, book_id=book_id, order=maxOrder + 1))
     shelf.last_modified = datetime.utcnow()
     try:
@@ -79,14 +87,14 @@ def add_to_shelf(shelf_id, book_id):
     except (OperationalError, InvalidRequestError) as e:
         ub.session.rollback()
         log.error_or_exception("Settings Database error: {}".format(e))
-        flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
+        flash(_("Oops! Database Error: %(error)s.", error=e.orig), category="error")
         if "HTTP_REFERER" in request.environ:
             return redirect(request.environ["HTTP_REFERER"])
         else:
             return redirect(url_for('web.index'))
     if not xhr:
         log.debug("Book has been added to shelf: {}".format(shelf.name))
-        flash(_(u"Book has been added to shelf: %(sname)s", sname=shelf.name), category="success")
+        flash(_("Book has been added to shelf: %(sname)s", sname=shelf.name), category="success")
         if "HTTP_REFERER" in request.environ:
             return redirect(request.environ["HTTP_REFERER"])
         else:
@@ -100,12 +108,12 @@ def search_to_shelf(shelf_id):
     shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
     if shelf is None:
         log.error("Invalid shelf specified: {}".format(shelf_id))
-        flash(_(u"Invalid shelf specified"), category="error")
+        flash(_("Invalid shelf specified"), category="error")
         return redirect(url_for('web.index'))

     if not check_shelf_edit_permissions(shelf):
         log.warning("You are not allowed to add a book to the shelf".format(shelf.name))
-        flash(_(u"You are not allowed to add a book to the shelf"), category="error")
+        flash(_("You are not allowed to add a book to the shelf"), category="error")
         return redirect(url_for('web.index'))

     if current_user.id in ub.searched_ids and ub.searched_ids[current_user.id]:
@@ -123,7 +131,7 @@ def search_to_shelf(shelf_id):

     if not books_for_shelf:
         log.error("Books are already part of {}".format(shelf.name))
-        flash(_(u"Books are already part of the shelf: %(name)s", name=shelf.name), category="error")
+        flash(_("Books are already part of the shelf: %(name)s", name=shelf.name), category="error")
         return redirect(url_for('web.index'))

     maxOrder = ub.session.query(func.max(ub.BookShelf.order)).filter(ub.BookShelf.shelf == shelf_id).first()[0] or 0
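The `first()[0] or 0` idiom at the end of the hunk guards against `MAX(order)` being NULL on an empty shelf. A standalone sketch with a faked query result (hypothetical data, no SQLAlchemy needed):

```python
def next_order(first_row):
    # first_row mimics ub.session.query(func.max(...)).first():
    # a one-element tuple whose value is None when the shelf is empty.
    max_order = first_row[0] or 0
    return max_order + 1

print(next_order((None,)))  # 1 -- empty shelf starts at order 1
print(next_order((7,)))     # 8 -- next slot after the current maximum
```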
@@ -135,14 +143,14 @@ def search_to_shelf(shelf_id):
     try:
         ub.session.merge(shelf)
         ub.session.commit()
-        flash(_(u"Books have been added to shelf: %(sname)s", sname=shelf.name), category="success")
+        flash(_("Books have been added to shelf: %(sname)s", sname=shelf.name), category="success")
     except (OperationalError, InvalidRequestError) as e:
         ub.session.rollback()
         log.error_or_exception("Settings Database error: {}".format(e))
-        flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
+        flash(_("Oops! Database Error: %(error)s.", error=e.orig), category="error")
     else:
         log.error("Could not add books to shelf: {}".format(shelf.name))
-        flash(_(u"Could not add books to shelf: %(sname)s", sname=shelf.name), category="error")
+        flash(_("Could not add books to shelf: %(sname)s", sname=shelf.name), category="error")
     return redirect(url_for('web.index'))


@@ -182,13 +190,13 @@ def remove_from_shelf(shelf_id, book_id):
     except (OperationalError, InvalidRequestError) as e:
         ub.session.rollback()
         log.error_or_exception("Settings Database error: {}".format(e))
-        flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
+        flash(_("Oops! Database Error: %(error)s.", error=e.orig), category="error")
         if "HTTP_REFERER" in request.environ:
             return redirect(request.environ["HTTP_REFERER"])
         else:
             return redirect(url_for('web.index'))
     if not xhr:
-        flash(_(u"Book has been removed from shelf: %(sname)s", sname=shelf.name), category="success")
+        flash(_("Book has been removed from shelf: %(sname)s", sname=shelf.name), category="success")
         if "HTTP_REFERER" in request.environ:
             return redirect(request.environ["HTTP_REFERER"])
         else:
@@ -197,7 +205,7 @@ def remove_from_shelf(shelf_id, book_id):
     else:
         if not xhr:
             log.warning("You are not allowed to remove a book from shelf: {}".format(shelf.name))
-            flash(_(u"Sorry you are not allowed to remove a book from this shelf"),
+            flash(_("Sorry you are not allowed to remove a book from this shelf"),
                   category="error")
             return redirect(url_for('web.index'))
         return "Sorry you are not allowed to remove a book from this shelf", 403
@@ -207,7 +215,7 @@ def remove_from_shelf(shelf_id, book_id):
 @login_required
 def create_shelf():
     shelf = ub.Shelf()
-    return create_edit_shelf(shelf, page_title=_(u"Create a Shelf"), page="shelfcreate")
+    return create_edit_shelf(shelf, page_title=_("Create a Shelf"), page="shelfcreate")


 @shelf.route("/shelf/edit/<int:shelf_id>", methods=["GET", "POST"])
@@ -215,9 +223,9 @@ def create_shelf():
 def edit_shelf(shelf_id):
     shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
     if not check_shelf_edit_permissions(shelf):
-        flash(_(u"Sorry you are not allowed to edit this shelf"), category="error")
+        flash(_("Sorry you are not allowed to edit this shelf"), category="error")
         return redirect(url_for('web.index'))
-    return create_edit_shelf(shelf, page_title=_(u"Edit a shelf"), page="shelfedit", shelf_id=shelf_id)
+    return create_edit_shelf(shelf, page_title=_("Edit a shelf"), page="shelfedit", shelf_id=shelf_id)


 @shelf.route("/shelf/delete/<int:shelf_id>", methods=["POST"])
@@ -232,7 +240,7 @@ def delete_shelf(shelf_id):
     except InvalidRequestError as e:
         ub.session.rollback()
         log.error_or_exception("Settings Database error: {}".format(e))
-        flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
+        flash(_("Oops! Database Error: %(error)s.", error=e.orig), category="error")
     return redirect(url_for('web.index'))
@@ -269,7 +277,7 @@ def order_shelf(shelf_id):
     except (OperationalError, InvalidRequestError) as e:
         ub.session.rollback()
         log.error_or_exception("Settings Database error: {}".format(e))
-        flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
+        flash(_("Oops! Database Error: %(error)s.", error=e.orig), category="error")

     result = list()
     if shelf:
@@ -278,7 +286,7 @@ def order_shelf(shelf_id):
             .add_columns(calibre_db.common_filters().label("visible")) \
             .filter(ub.BookShelf.shelf == shelf_id).order_by(ub.BookShelf.order.asc()).all()
         return render_title_template('shelf_order.html', entries=result,
-                                     title=_(u"Change order of Shelf: '%(name)s'", name=shelf.name),
+                                     title=_("Change order of Shelf: '%(name)s'", name=shelf.name),
                                      shelf=shelf, page="shelforder")
     else:
         abort(404)
@@ -295,11 +303,14 @@ def check_shelf_edit_permissions(cur_shelf):


 def check_shelf_view_permissions(cur_shelf):
-    if cur_shelf.is_public:
-        return True
-    if current_user.is_anonymous or cur_shelf.user_id != current_user.id:
-        log.error("User is unauthorized to view non-public shelf: {}".format(cur_shelf.name))
-        return False
+    try:
+        if cur_shelf.is_public:
+            return True
+        if current_user.is_anonymous or cur_shelf.user_id != current_user.id:
+            log.error("User is unauthorized to view non-public shelf: {}".format(cur_shelf.name))
+            return False
+    except Exception as e:
+        log.error(e)
     return True
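The rewritten `check_shelf_view_permissions` wraps the old logic in try/except, so an unexpected error logs instead of propagating and the shelf defaults to visible. A self-contained sketch of that control flow (a dict shelf and plain arguments are stand-ins for the real ORM and `current_user` objects):

```python
def check_view(shelf, user_id, anonymous, log=print):
    try:
        if shelf["is_public"]:
            return True
        if anonymous or shelf["user_id"] != user_id:
            log("unauthorized")
            return False
    except Exception as e:   # new in the diff: errors never raise here
        log(e)
    return True              # owner, or error -> default to visible

print(check_view({"is_public": True}, 1, False))                  # True
print(check_view({"is_public": False, "user_id": 2}, 1, False))   # False
print(check_view({}, 1, False))   # True (KeyError is swallowed)
```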
@@ -310,7 +321,7 @@ def create_edit_shelf(shelf, page_title, page, shelf_id=False):
     if request.method == "POST":
         to_save = request.form.to_dict()
         if not current_user.role_edit_shelfs() and to_save.get("is_public") == "on":
-            flash(_(u"Sorry you are not allowed to create a public shelf"), category="error")
+            flash(_("Sorry you are not allowed to create a public shelf"), category="error")
             return redirect(url_for('web.index'))
         is_public = 1 if to_save.get("is_public") == "on" else 0
         if config.config_kobo_sync:
@@ -327,24 +338,24 @@ def create_edit_shelf(shelf, page_title, page, shelf_id=False):
             shelf.user_id = int(current_user.id)
             ub.session.add(shelf)
             shelf_action = "created"
-            flash_text = _(u"Shelf %(title)s created", title=shelf_title)
+            flash_text = _("Shelf %(title)s created", title=shelf_title)
         else:
             shelf_action = "changed"
-            flash_text = _(u"Shelf %(title)s changed", title=shelf_title)
+            flash_text = _("Shelf %(title)s changed", title=shelf_title)
         try:
             ub.session.commit()
-            log.info(u"Shelf {} {}".format(shelf_title, shelf_action))
+            log.info("Shelf {} {}".format(shelf_title, shelf_action))
             flash(flash_text, category="success")
             return redirect(url_for('shelf.show_shelf', shelf_id=shelf.id))
         except (OperationalError, InvalidRequestError) as ex:
             ub.session.rollback()
             log.error_or_exception(ex)
             log.error_or_exception("Settings Database error: {}".format(ex))
-            flash(_(u"Database error: %(error)s.", error=ex.orig), category="error")
+            flash(_("Oops! Database Error: %(error)s.", error=ex.orig), category="error")
         except Exception as ex:
             ub.session.rollback()
             log.error_or_exception(ex)
-            flash(_(u"There was an error"), category="error")
+            flash(_("There was an error"), category="error")
     return render_title_template('shelf_edit.html',
                                  shelf=shelf,
                                  title=page_title,
@@ -366,7 +377,7 @@ def check_shelf_is_unique(title, is_public, shelf_id=False):

     if not is_shelf_name_unique:
         log.error("A public shelf with the name '{}' already exists.".format(title))
-        flash(_(u"A public shelf with the name '%(title)s' already exists.", title=title),
+        flash(_("A public shelf with the name '%(title)s' already exists.", title=title),
               category="error")
     else:
         is_shelf_name_unique = ub.session.query(ub.Shelf) \
@@ -377,7 +388,7 @@ def check_shelf_is_unique(title, is_public, shelf_id=False):

     if not is_shelf_name_unique:
         log.error("A private shelf with the name '{}' already exists.".format(title))
-        flash(_(u"A private shelf with the name '%(title)s' already exists.", title=title),
+        flash(_("A private shelf with the name '%(title)s' already exists.", title=title),
               category="error")
     return is_shelf_name_unique

@@ -454,14 +465,14 @@ def render_show_shelf(shelf_type, shelf_id, page_no, sort_param):
     except (OperationalError, InvalidRequestError) as e:
         ub.session.rollback()
         log.error_or_exception("Settings Database error: {}".format(e))
-        flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
+        flash(_("Oops! Database Error: %(error)s.", error=e.orig), category="error")

         return render_title_template(page,
                                      entries=result,
                                      pagination=pagination,
-                                     title=_(u"Shelf: '%(name)s'", name=shelf.name),
+                                     title=_("Shelf: '%(name)s'", name=shelf.name),
                                      shelf=shelf,
                                      page="shelf")
     else:
-        flash(_(u"Error opening shelf. Shelf does not exist or is not accessible"), category="error")
+        flash(_("Error opening shelf. Shelf does not exist or is not accessible"), category="error")
         return redirect(url_for("web.index"))
@@ -3290,10 +3290,13 @@ div.btn-group[role=group][aria-label="Download, send to Kindle, reading"] .dropdown-menu {
     -ms-transform-origin: center top;
     transform-origin: center top;
     border: 0;
-    left: 0 !important;
     overflow-y: auto;
 }

+.dropdown-menu:not(.datepicker-dropdown):not(.profileDropli) {
+    left: 0 !important;
+}
+
 #add-to-shelves {
+    min-height: 48px;
     max-height: calc(100% - 120px);
     overflow-y: auto;
 }
@@ -4423,38 +4426,6 @@ body.advanced_search > div.container-fluid > div.row-fluid > div.col-sm-10 > div
     left: 49px;
     margin-top: 5px
 }
-
-body:not(.blur) > .navbar > .container-fluid > .navbar-header:after, body:not(.blur) > .navbar > .container-fluid > .navbar-header:before {
-    color: hsla(0, 0%, 100%, .7);
-    cursor: pointer;
-    display: block;
-    font-family: plex-icons-new, serif;
-    font-size: 20px;
-    font-stretch: 100%;
-    font-style: normal;
-    font-variant-caps: normal;
-    font-variant-east-asian: normal;
-    font-variant-numeric: normal;
-    font-weight: 400;
-    height: 60px;
-    letter-spacing: normal;
-    line-height: 60px;
-    position: absolute
-}
-
-body:not(.blur) > .navbar > .container-fluid > .navbar-header:before {
-    content: "\EA30";
-    -webkit-font-variant-ligatures: normal;
-    font-variant-ligatures: normal;
-    left: 20px
-}
-
-body:not(.blur) > .navbar > .container-fluid > .navbar-header:after {
-    content: "\EA2F";
-    -webkit-font-variant-ligatures: normal;
-    font-variant-ligatures: normal;
-    left: 60px
-}
 }

 body.admin > div.container-fluid > div > div.col-sm-10 > div.container-fluid > div.row:first-of-type > div.col > h2:before, body.admin > div.container-fluid > div > div.col-sm-10 > div.discover > h2:first-of-type:before, body.edituser.admin > div.container-fluid > div.row-fluid > div.col-sm-10 > div.discover > h1:before, body.newuser.admin > div.container-fluid > div.row-fluid > div.col-sm-10 > div.discover > h1:before {
@@ -4842,8 +4813,14 @@ body.advsearch:not(.blur) > div.container-fluid > div.row-fluid > div.col-sm-10
     z-index: 999999999999999999999999999999999999
 }

-.search #shelf-actions, body.login .home-btn {
-    display: none
+body.search #shelf-actions button#add-to-shelf {
+    height: 40px;
+}
+@media screen and (max-width: 767px) {
+    body.search .discover, body.advsearch .discover {
+        display: flex;
+        flex-direction: column;
+    }
 }

 body.read:not(.blur) a[href*=readbooks] {
@@ -5164,7 +5141,7 @@ body.login > div.navbar.navbar-default.navbar-static-top > div > div.navbar-head
     right: 5px
 }

-#shelf-actions > .btn-group.open, .downloadBtn.open, .profileDrop[aria-expanded=true] {
+body:not(.search) #shelf-actions > .btn-group.open, .downloadBtn.open, .profileDrop[aria-expanded=true] {
     pointer-events: none
 }

@@ -5181,7 +5158,7 @@ body.login > div.navbar.navbar-default.navbar-static-top > div > div.navbar-head
     color: var(--color-primary)
 }

-#shelf-actions, #shelf-actions > .btn-group, #shelf-actions > .btn-group > .empty-ul {
+body:not(.search) #shelf-actions, body:not(.search) #shelf-actions > .btn-group, body:not(.search) #shelf-actions > .btn-group > .empty-ul {
     pointer-events: none
 }

@@ -7309,6 +7286,11 @@ body.edituser.admin > div.container-fluid > div.row-fluid > div.col-sm-10 > div.
     float: right
 }

+body.blur #main-nav + #scnd-nav .create-shelf, body.blur #main-nav + .col-sm-2 #scnd-nav .create-shelf {
+    float: none;
+    margin: 5px 0 10px -10px;
+}
+
 #main-nav + #scnd-nav .nav-head.hidden-xs {
     display: list-item !important;
     width: 225px
@@ -22,3 +22,7 @@ body.serieslist.grid-view div.container-fluid > div > div.col-sm-10::before {
     padding: 0 0;
     line-height: 15px;
 }
+
+input.datepicker {color: transparent}
+input.datepicker:focus {color: transparent}
+input.datepicker:focus + input {color: #555}
@@ -149,6 +149,20 @@ body {
     word-wrap: break-word;
 }

+#mainContent > canvas {
+    display: block;
+    margin-left: auto;
+    margin-right: auto;
+}
+
+.long-strip > .mainImage {
+    margin-bottom: 4px;
+}
+
+.long-strip > .mainImage:last-child {
+    margin-bottom: 0px !important;
+}
+
 #titlebar {
     min-height: 25px;
     height: auto;
@ -1,6 +0,0 @@
|
||||||
<!-- This Source Code Form is subject to the terms of the Mozilla Public
|
|
||||||
- License, v. 2.0. If a copy of the MPL was not distributed with this
|
|
||||||
- file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
|
|
||||||
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"
|
|
||||||
fill="rgba(255,255,255,1)"><path d="M8 12a1 1 0 0 1-.707-.293l-5-5a1 1 0 0 1 1.414-1.414L8
|
|
||||||
9.586l4.293-4.293a1 1 0 0 1 1.414 1.414l-5 5A1 1 0 0 1 8 12z"></path></svg>
|
|
Before Width: | Height: | Size: 461 B |
|
@ -1,5 +0,0 @@
|
||||||
<!-- This Source Code Form is subject to the terms of the Mozilla Public
|
|
||||||
- License, v. 2.0. If a copy of the MPL was not distributed with this
|
|
||||||
- file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
|
|
||||||
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"
|
|
||||||
fill="rgba(255,255,255,1)"><path d="M13 11a1 1 0 0 1-.707-.293L8 6.414l-4.293 4.293a1 1 0 0 1-1.414-1.414l5-5a1 1 0 0 1 1.414 0l5 5A1 1 0 0 1 13 11z"></path></svg>
|
|
Before Width: | Height: | Size: 458 B |
Before Width: | Height: | Size: 326 B |
Before Width: | Height: | Size: 326 B |
|
@ -1,16 +0,0 @@
|
||||||
<!-- This Source Code Form is subject to the terms of the Mozilla Public
|
|
||||||
- License, v. 2.0. If a copy of the MPL was not distributed with this
|
|
||||||
- file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
|
|
||||||
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16
|
|
||||||
16"
|
|
||||||
fill="rgba(255,255,255,1)">
|
|
||||||
<path
|
|
||||||
d="M8 16a8 8 0 1 1 8-8 8.009 8.009 0 0 1-8 8zM8 2a6 6 0 1 0 6 6 6.006 6.006 0 0 0-6-6z">
|
|
||||||
</path>
|
|
||||||
<path
|
|
||||||
d="M8 7a1 1 0 0 0-1 1v3a1 1 0 0 0 2 0V8a1 1 0 0 0-1-1z">
|
|
||||||
</path>
|
|
||||||
<circle
|
|
||||||
cx="8" cy="5" r="1.188">
|
|
||||||
</circle>
|
|
||||||
</svg>
|
|
Before Width: | Height: | Size: 557 B |
|
@ -1,2 +0,0 @@
|
||||||
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16"
|
|
||||||
fill="rgba(255,255,255,1)"><path d="M13 13c-.3 0-.5-.1-.7-.3L8 8.4l-4.3 4.3c-.9.9-2.3-.5-1.4-1.4l5-5c.4-.4 1-.4 1.4 0l5 5c.6.6.2 1.7-.7 1.7zm0-11H3C1.7 2 1.7 4 3 4h10c1.3 0 1.3-2 0-2z"/></svg>
|
|
Before Width: | Height: | Size: 255 B |
|
@ -1,2 +0,0 @@
|
||||||
<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"
|
|
||||||
fill="rgba(255,255,255,1)"><path d="M15 3.7V13c0 1.5-1.53 3-3 3H7.13c-.72 0-1.63-.5-2.13-1l-5-5s.84-1 .87-1c.13-.1.33-.2.53-.2.1 0 .3.1.4.2L4 10.6V2.7c0-.6.4-1 1-1s1 .4 1 1v4.6h1V1c0-.6.4-1 1-1s1 .4 1 1v6.3h1V1.7c0-.6.4-1 1-1s1 .4 1 1v5.7h1V3.7c0-.6.4-1 1-1s1 .4 1 1z"/></svg>
|
|
Before Width: | Height: | Size: 339 B |
|
@ -1,2 +0,0 @@
|
||||||
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16"
|
|
||||||
fill="rgba(255,255,255,1)"><path d="M8 10c-.3 0-.5-.1-.7-.3l-5-5c-.9-.9.5-2.3 1.4-1.4L8 7.6l4.3-4.3c.9-.9 2.3.5 1.4 1.4l-5 5c-.2.2-.4.3-.7.3zm5 2H3c-1.3 0-1.3 2 0 2h10c1.3 0 1.3-2 0-2z"/></svg>
|
|
Before Width: | Height: | Size: 256 B |
|
@ -1,2 +0,0 @@
|
||||||
<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"
|
|
||||||
fill="rgba(255,255,255,1)"><path d="M1 1a1 1 0 011 1v2.4A7 7 0 118 15a7 7 0 01-4.9-2 1 1 0 011.4-1.5 5 5 0 10-1-5.5H6a1 1 0 010 2H1a1 1 0 01-1-1V2a1 1 0 011-1z"/></svg>
|
|
Before Width: | Height: | Size: 231 B |
|
@ -1,5 +0,0 @@
|
||||||
<!-- This Source Code Form is subject to the terms of the Mozilla Public
|
|
||||||
- License, v. 2.0. If a copy of the MPL was not distributed with this
|
|
||||||
- file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
|
|
||||||
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"
|
|
||||||
fill="rgba(255,255,255,1)"><path d="M15 1a1 1 0 0 0-1 1v2.418A6.995 6.995 0 1 0 8 15a6.954 6.954 0 0 0 4.95-2.05 1 1 0 0 0-1.414-1.414A5.019 5.019 0 1 1 12.549 6H10a1 1 0 0 0 0 2h5a1 1 0 0 0 1-1V2a1 1 0 0 0-1-1z"></path></svg>
|
|
Before Width: | Height: | Size: 521 B |
|
@ -1,2 +0,0 @@
|
||||||
<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"
|
|
||||||
fill="rgba(255,255,255,1)"><path d="M0 4h1.5c1 0 1.5.5 1.5 1.5v5c0 1-.5 1.5-1.5 1.5H0zM9.5 4c1 0 1.5.5 1.5 1.5v5c0 1-.5 1.5-1.5 1.5h-3c-1 0-1.5-.5-1.5-1.5v-5C5 4.5 5.5 4 6.5 4zM16 4h-1.5c-1 0-1.5.5-1.5 1.5v5c0 1 .5 1.5 1.5 1.5H16z"/></svg>
Before: 302 B
@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"><path d="M9.5 4c1 0 1.5.5 1.5 1.5v5c0 1-.5 1.5-1.5 1.5h-3c-1 0-1.5-.5-1.5-1.5v-5C5 4.5 5.5 4 6.5 4z"/></svg>
After: 171 B
@@ -1,2 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"
fill="rgba(255,255,255,1)"><path d="M9.5 4c1 0 1.5.5 1.5 1.5v5c0 1-.5 1.5-1.5 1.5h-3c-1 0-1.5-.5-1.5-1.5v-5C5 4.5 5.5 4 6.5 4zM11 0v.5c0 1-.5 1.5-1.5 1.5h-3C5.5 2 5 1.5 5 .5V0h6zM11 16v-.5c0-1-.5-1.5-1.5-1.5h-3c-1 0-1.5.5-1.5 1.5v.5h6z"/></svg>
Before: 307 B
@@ -1,2 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16"
fill="rgba(255,255,255,1)"><path d="M5.5 4c1 0 1.5.5 1.5 1.5v5c0 1-.5 1.5-1.5 1.5h-3c-1 0-1.5-.5-1.5-1.5v-5C1 4.5 1.5 4 2.5 4zM7 0v.5C7 1.5 6.5 2 5.5 2h-3C1.5 2 1 1.5 1 .5V0h6zM7 16v-.5c0-1-.5-1.5-1.5-1.5h-3c-1 0-1.5.5-1.5 1.5v.5h6zM13.5 4c1 0 1.5.5 1.5 1.5v5c0 1-.5 1.5-1.5 1.5h-3c-1 0-1.5-.5-1.5-1.5v-5c0-1 .5-1.5 1.5-1.5zM15 0v.5c0 1-.5 1.5-1.5 1.5h-3C9.5 2 9 1.5 9 .5V0h6zM15 16v-.507c0-1-.5-1.5-1.5-1.5h-3C9.5 14 9 14.5 9 15.5v.5h6z"/></svg>
Before: 509 B
@@ -1,5 +0,0 @@
<!-- This Source Code Form is subject to the terms of the Mozilla Public
- License, v. 2.0. If a copy of the MPL was not distributed with this
- file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"
fill="rgba(255,255,255,1)"><path d="M12.408 8.217l-8.083-6.7A.2.2 0 0 0 4 1.672V12.3a.2.2 0 0 0 .333.146l2.56-2.372 1.857 3.9A1.125 1.125 0 1 0 10.782 13L8.913 9.075l3.4-.51a.2.2 0 0 0 .095-.348z"></path></svg>
Before: 505 B
@@ -1,2 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16"
fill="rgba(255,255,255,1)"><path d="M1.5 3.5C.5 3.5 0 4 0 5v6.5c0 1 .5 1.5 1.5 1.5h4c1 0 1.5-.5 1.5-1.5V5c0-1-.5-1.5-1.5-1.5zm2 1.2c.8 0 1.4.2 1.8.6.5.4.7 1 .7 1.7 0 .5-.2 1-.5 1.4-.2.3-.5.7-1 1l-.6.4c-.4.3-.6.4-.75.56-.15.14-.25.24-.35.44H6v1.3H1c0-.6.1-1.1.3-1.5.3-.6.7-1 1.5-1.6.7-.4 1.1-.8 1.28-1 .32-.3.42-.6.42-1 0-.3-.1-.6-.23-.8-.17-.2-.37-.3-.77-.3s-.7.1-.9.5c-.04.2-.1.5-.1.9H1.1c0-.6.1-1.1.3-1.5.4-.7 1.1-1.1 2.1-1.1zM10.54 3.54C9.5 3.54 9 4 9 5v6.5c0 1 .5 1.5 1.54 1.5h4c.96 0 1.46-.5 1.46-1.5V5c0-1-.5-1.46-1.5-1.46zm1.9.95c.7 0 1.3.2 1.7.5.4.4.6.8.6 1.4 0 .4-.1.8-.4 1.1-.2.2-.3.3-.5.4.1 0 .3.1.6.3.4.3.5.8.5 1.4 0 .6-.2 1.2-.6 1.6-.4.5-1.1.7-1.9.7-1 0-1.8-.3-2.2-1-.14-.29-.24-.69-.24-1.29h1.4c0 .3 0 .5.1.7.2.4.5.5 1 .5.3 0 .5-.1.7-.3.2-.2.3-.5.3-.8 0-.5-.2-.8-.6-.95-.2-.05-.5-.15-1-.15v-1c.5 0 .8-.1 1-.14.3-.1.5-.4.5-.9 0-.3-.1-.5-.2-.7-.2-.2-.4-.3-.7-.3-.3 0-.6.1-.75.3-.2.2-.2.5-.2.86h-1.34c0-.4.1-.7.19-1.1 0-.12.2-.32.4-.62.2-.2.4-.3.7-.4.3-.1.6-.1 1-.1z"/></svg>
Before: 1.0 KiB
@@ -1,2 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"
fill="rgba(255,255,255,1)"><path d="M6 3c-1 0-1.5.5-1.5 1.5v7c0 1 .5 1.5 1.5 1.5h4c1 0 1.5-.5 1.5-1.5v-7c0-1-.5-1.5-1.5-1.5z"/></svg>
Before: 196 B
@@ -1,2 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16"
fill="rgba(255,255,255,1)"><path d="M10.56 3.5C9.56 3.5 9 4 9 5v6.5c0 1 .5 1.5 1.5 1.5h4c1 0 1.5-.5 1.5-1.5V5c0-1-.5-1.5-1.5-1.5zm1.93 1.2c.8 0 1.4.2 1.8.64.5.4.7 1 .7 1.7 0 .5-.2 1-.5 1.44-.2.3-.6.6-1 .93l-.6.4c-.4.3-.6.4-.7.55-.1.1-.2.2-.3.4h3.2v1.27h-5c0-.5.1-1 .3-1.43.2-.49.7-1 1.5-1.54.7-.5 1.1-.8 1.3-1.02.3-.3.4-.7.4-1.05 0-.3-.1-.6-.3-.77-.2-.2-.4-.3-.7-.3-.4 0-.7.2-.9.5-.1.2-.1.5-.2.9h-1.4c0-.6.2-1.1.3-1.5.4-.7 1.1-1.1 2-1.1zM1.54 3.5C.54 3.5 0 4 0 5v6.5c0 1 .5 1.5 1.54 1.5h4c1 0 1.5-.5 1.5-1.5V5c0-1-.5-1.5-1.5-1.5zm1.8 1.125H4.5V12H3V6.9H1.3v-1c.5 0 .8 0 .97-.03.33-.07.53-.17.73-.37.1-.2.2-.3.25-.5.05-.2.05-.3.05-.3z"/></svg>
Before: 705 B
@@ -1,2 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"
fill="rgba(255,255,255,1)"><path d="M4 16V2s0-1 1-1h6s1 0 1 1v14l-4-5z"/></svg>
Before: 142 B
@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"><path d="m14 9h-6c-1.3 0-1.3 2 0 2h6c1.3 0 1.3-2 0-2zm-5.2-8h-3.8c-1.3 0-1.3 2 0 2h1.7zm-6.8 0c-1 0-1.3 1-0.7 1.7 0.7 0.6 1.7 0.3 1.7-0.7 0-0.5-0.4-1-1-1zm3 8c-1 0-1.3 1-0.7 1.7 0.6 0.6 1.7 0.2 1.7-0.7 0-0.5-0.4-1-1-1zm0.3-4h-0.3c-1.4 0-1.4 2 0 2h2.3zm-3.3 0c-0.9 0-1.4 1-0.7 1.7 0.7 0.6 1.7 0.2 1.7-0.7 0-0.6-0.5-1-1-1zm12 8h-9c-1.3 0-1.3 2 0 2h9c1.3 0 1.3-2 0-2zm-12 0c-1 0-1.3 1-0.7 1.7 0.7 0.6 1.7 0.2 1.7-0.712 0-0.5-0.4-1-1-1z"/><path d="m7.37 4.838 3.93-3.911v2.138h3.629v3.546h-3.629v2.138l-3.93-3.911"/></svg>
After: 581 B
@@ -1,5 +0,0 @@
<!-- This Source Code Form is subject to the terms of the Mozilla Public
- License, v. 2.0. If a copy of the MPL was not distributed with this
- file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"
fill="rgba(255,255,255,1)"><path d="M14 3h-2v2h2v8H2V5h7V3h-.849L6.584 1.538A2 2 0 0 0 5.219 1H2a2 2 0 0 0-2 2v10a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V5a2 2 0 0 0-2-2zM2 3h3.219l1.072 1H2z"></path><path d="M8.146 6.146a.5.5 0 0 0 0 .707l2 2a.5.5 0 0 0 .707 0l2-2a.5.5 0 1 0-.707-.707L11 7.293V.5a.5.5 0 0 0-1 0v6.793L8.854 6.146a.5.5 0 0 0-.708 0z"></path></svg>
Before: 651 B
@@ -0,0 +1,24 @@
<?xml version="1.0" encoding="iso-8859-1"?>
<!-- copied from https://www.svgrepo.com/svg/255881/text -->
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
viewBox="0 0 16 16" style="enable-background:new 0 0 16 16;" xml:space="preserve">
<g>
<g transform="scale(0.03125)">
<path d="M405.787,43.574H8.17c-4.513,0-8.17,3.658-8.17,8.17v119.83c0,4.512,3.657,8.17,8.17,8.17h32.681
c4.513,0,8.17-3.658,8.17-8.17v-24.511h95.319v119.83c0,4.512,3.657,8.17,8.17,8.17c4.513,0,8.17-3.658,8.17-8.17v-128
c0-4.512-3.657-8.17-8.17-8.17H40.851c-4.513,0-8.17,3.658-8.17,8.17v24.511H16.34V59.915h381.277v103.489h-16.34v-24.511
c0-4.512-3.657-8.17-8.17-8.17h-111.66c-4.513,0-8.17,3.658-8.17,8.17v288.681c0,4.512,3.657,8.17,8.17,8.17h57.191v16.34H95.319
v-16.34h57.191c4.513,0,8.17-3.658,8.17-8.17v-128c0-4.512-3.657-8.17-8.17-8.17c-4.513,0-8.17,3.658-8.17,8.17v119.83H87.149
c-4.513,0-8.17,3.658-8.17,8.17v32.681c0,4.512,3.657,8.17,8.17,8.17h239.66c4.513,0,8.17-3.658,8.17-8.17v-32.681
c0-4.512-3.657-8.17-8.17-8.17h-57.192v-272.34h95.319v24.511c0,4.512,3.657,8.17,8.17,8.17h32.681c4.513,0,8.17-3.658,8.17-8.17
V51.745C413.957,47.233,410.3,43.574,405.787,43.574z"/>
</g>
</g>
<g>
<g transform="scale(0.03125)">
<path d="M503.83,452.085h-24.511V59.915h24.511c4.513,0,8.17-3.658,8.17-8.17s-3.657-8.17-8.17-8.17h-65.362
c-4.513,0-8.17,3.658-8.17,8.17s3.657,8.17,8.17,8.17h24.511v392.17h-24.511c-4.513,0-8.17,3.658-8.17,8.17s3.657,8.17,8.17,8.17
h65.362c4.513,0,8.17-3.658,8.17-8.17S508.343,452.085,503.83,452.085z"/>
</g>
</g>
</svg>
After: 1.6 KiB
@@ -0,0 +1,9 @@
<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE svg PUBLIC '-//W3C//DTD SVG 1.1//EN' 'http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd'>
<svg version="1.1" xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16" xmlns:xlink="http://www.w3.org/1999/xlink" enable-background="new 0 0 16 16">
<g>
<g transform="scale(0.03125)">
<path d="m455.1,137.9l-32.4,32.4-81-81.1 32.4-32.4c6.6-6.6 18.1-6.6 24.7,0l56.3,56.4c6.8,6.8 6.8,17.9 0,24.7zm-270.7,271l-81-81.1 209.4-209.7 81,81.1-209.4,209.7zm-99.7-42l60.6,60.7-84.4,23.8 23.8-84.5zm399.3-282.6l-56.3-56.4c-11-11-50.7-31.8-82.4,0l-285.3,285.5c-2.5,2.5-4.3,5.5-5.2,8.9l-43,153.1c-2,7.1 0.1,14.7 5.2,20 5.2,5.3 15.6,6.2 20,5.2l153-43.1c3.4-0.9 6.4-2.7 8.9-5.2l285.1-285.5c22.7-22.7 22.7-59.7 0-82.5z"/>
</g>
</g>
</svg>
After: 804 B
@@ -1 +0,0 @@
<svg width="16" height="16" xmlns="http://www.w3.org/2000/svg" fill="rgba(255,255,255,1)"><path d="M8 11a1 1 0 01-.707-.293l-2.99-2.99c-.91-.942.471-2.324 1.414-1.414L8 8.586l2.283-2.283c.943-.91 2.324.472 1.414 1.414l-2.99 2.99A1 1 0 018 11z"/></svg>
Before: 251 B
@@ -1,5 +0,0 @@
<!-- This Source Code Form is subject to the terms of the Mozilla Public
- License, v. 2.0. If a copy of the MPL was not distributed with this
- file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"
fill="rgba(255,255,255,1)"><path d="M14.859 3.2a1.335 1.335 0 0 1-1.217.8H13v1h1v8H2V5h8V4h-.642a1.365 1.365 0 0 1-1.325-1.11L6.584 1.538A2 2 0 0 0 5.219 1H2a2 2 0 0 0-2 2v10a2 2 0 0 0 2 2h12a2 2 0 0 0 2-2V5a2 2 0 0 0-1.141-1.8zM2 3h3.219l1.072 1H2zm7.854-.146L11 1.707V8.5a.5.5 0 0 0 1 0V1.707l1.146 1.146a.5.5 0 1 0 .707-.707l-2-2a.5.5 0 0 0-.707 0l-2 2a.5.5 0 0 0 .707.707z"></path></svg>
Before: 686 B
@@ -1,8 +0,0 @@
<!-- This Source Code Form is subject to the terms of the Mozilla Public
- License, v. 2.0. If a copy of the MPL was not distributed with this
- file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16
16"
fill="rgba(255,255,255,1)"><path transform='rotate(90) translate(0, -16)'
d="M15.707 7.293l-6-6a1 1 0 0 0-1.414 1.414L12.586 7H1a1 1 0 0 0 0 2h11.586l-4.293
4.293a1 1 0 1 0 1.414 1.414l6-6a1 1 0 0 0 0-1.414z"></path></svg>
Before: 517 B
@@ -1,13 +0,0 @@
<!-- This Source Code Form is subject to the terms of the Mozilla Public
- License, v. 2.0. If a copy of the MPL was not distributed with this
- file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16
16"
fill="rgba(255,255,255,1)">
<path
transform='rotate(90) translate(0, -16)'
d="M15 7H3.414l4.293-4.293a1 1 0 0
0-1.414-1.414l-6 6a1 1 0 0 0 0 1.414l6 6a1 1 0 0 0 1.414-1.414L3.414 9H15a1 1 0 0
0 0-2z">
</path>
</svg>
Before: 517 B
@@ -1,2 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16"
fill="rgba(255,255,255,1)"><path d="M.5 1H7s0-1 1-1 1 1 1 1h6.5s.5 0 .5.5-.5.5-.5.5H.5S0 2 0 1.5.5 1 .5 1zM1 3h14v7c0 2-1 2-2 2H3c-1 0-2 0-2-2zm5 1v7l6-3.5zM3.72 15.33l.53-2s0-.5.65-.35c.51.13.38.63.38.63l-.53 2s0 .5-.64.35c-.53-.13-.39-.63-.39-.63zM11.24 15.61l-.53-1.99s0-.5.38-.63c.51-.13.64.35.64.35l.53 2s0 .5-.38.63c-.5.13-.64-.35-.65-.35z"/></svg>
Before: 417 B
@@ -1,5 +0,0 @@
<!-- This Source Code Form is subject to the terms of the Mozilla Public
- License, v. 2.0. If a copy of the MPL was not distributed with this
- file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"
fill="rgba(255,255,255,1)"><path d="M14 5h-1V1a1 1 0 0 0-1-1H4a1 1 0 0 0-1 1v4H2a2 2 0 0 0-2 2v5h3v3a1 1 0 0 0 1 1h8a1 1 0 0 0 1-1v-3h3V7a2 2 0 0 0-2-2zM2.5 8a.5.5 0 1 1 .5-.5.5.5 0 0 1-.5.5zm9.5 7H4v-5h8zm0-10H4V1h8zm-6.5 7h4a.5.5 0 0 0 0-1h-4a.5.5 0 1 0 0 1zm0 2h5a.5.5 0 0 0 0-1h-5a.5.5 0 1 0 0 1z"></path></svg>
Before: 610 B
@@ -1,5 +0,0 @@
<!-- This Source Code Form is subject to the terms of the Mozilla Public
- License, v. 2.0. If a copy of the MPL was not distributed with this
- file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"
fill="rgba(255,255,255,1)"><path d="M15.707 14.293l-4.822-4.822a6.019 6.019 0 1 0-1.414 1.414l4.822 4.822a1 1 0 0 0 1.414-1.414zM6 10a4 4 0 1 1 4-4 4 4 0 0 1-4 4z"></path></svg>
Before: 472 B
@@ -1,5 +0,0 @@
<!-- This Source Code Form is subject to the terms of the Mozilla Public
- License, v. 2.0. If a copy of the MPL was not distributed with this
- file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"
fill="rgba(255,255,255,1)"><path d="M8.707 7.293l-5-5a1 1 0 0 0-1.414 1.414L6.586 8l-4.293 4.293a1 1 0 1 0 1.414 1.414l5-5a1 1 0 0 0 0-1.414zm6 0l-5-5a1 1 0 0 0-1.414 1.414L12.586 8l-4.293 4.293a1 1 0 1 0 1.414 1.414l5-5a1 1 0 0 0 0-1.414z"></path></svg>
Before: 549 B
@@ -1,5 +0,0 @@
<!-- This Source Code Form is subject to the terms of the Mozilla Public
- License, v. 2.0. If a copy of the MPL was not distributed with this
- file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"
fill="rgba(255,255,255,1)"><path d="M3 1h10a3.008 3.008 0 0 1 3 3v8a3.009 3.009 0 0 1-3 3H3a3.005 3.005 0 0 1-3-3V4a3.013 3.013 0 0 1 3-3zm11 11V4a1 1 0 0 0-1-1H8v10h5a1 1 0 0 0 1-1zM2 12a1 1 0 0 0 1 1h4V3H3a1 1 0 0 0-1 1v8z"></path><path d="M3.5 5h2a.5.5 0 0 0 0-1h-2a.5.5 0 0 0 0 1zm0 2h2a.5.5 0 0 0 0-1h-2a.5.5 0 0 0 0 1zm1 2h1a.5.5 0 0 0 0-1h-1a.5.5 0 0 0 0 1z"></path></svg>
Before: 674 B
@@ -1,2 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"
fill="rgba(255,255,255,1)"><path d="M6.2 2s.5-.5 1.06 0c.5.5 0 1 0 1l-4.6 4.61s-2.5 2.5 0 5 5 0 5 0L13.8 6.4s1.6-1.6 0-3.2-3.2 0-3.2 0L5.8 8s-.7.7 0 1.4 1.4 0 1.4 0l3.9-3.9s.6-.5 1 0c.5.5 0 1 0 1l-3.8 4s-1.8 1.8-3.5 0C3 8.7 4.8 7 4.8 7l4.7-4.9s2.7-2.6 5.3 0c2.6 2.6 0 5.3 0 5.3l-6.2 6.3s-3.5 3.5-7 0 0-7 0-7z"/></svg>
Before: 380 B
@@ -1 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 4.233 4.233" height="16" width="16" fill="rgba(255,255,255,1)"><path d="M.15 2.992c-.198.1-.2.266-.002.365l1.604.802a.93.93 0 00.729-.001l1.602-.801c.198-.1.197-.264 0-.364l-.695-.348c-1.306.595-2.542 0-2.542 0m-.264.53l.658-.329c.6.252 1.238.244 1.754 0l.659.329-1.536.768zM.15 1.935c-.198.1-.198.265 0 .364l1.604.802a.926.926 0 00.727 0l1.603-.802c.198-.099.198-.264 0-.363l-.694-.35c-1.14.56-2.546.001-2.546.001m-.264.53l.664-.332c.52.266 1.261.235 1.75.002l.659.33-1.537.768zM.15.877c-.198.099-.198.264 0 .363l1.604.802a.926.926 0 00.727 0l1.603-.802c.198-.099.198-.264 0-.363L2.481.075a.926.926 0 00-.727 0zm.43.182L2.117.29l1.538.769-1.538.768z"/></svg>
Before: 712 B
@@ -1,2 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"
fill="rgba(255,255,255,1)"><path d="M14 9H8c-1.3 0-1.3 2 0 2h6c1.3 0 1.3-2 0-2zm0-8H5C3.7 1 3.7 3 5 3h9c1.3 0 1.3-2 0-2zM2 1C1 1 .7 2 1.3 2.7 2 3.3 3 3 3 2c0-.5-.4-1-1-1zm3 8c-1 0-1.3 1-.7 1.7.6.6 1.7.2 1.7-.7 0-.5-.4-1-1-1zM14 5H5C3.6 5 3.6 7 5 7h9c1.3 0 1.3-2 0-2zM2 5c-.9 0-1.4 1-.7 1.7C2 7.3 3 6.9 3 6c0-.6-.5-1-1-1zM14 13H5c-1.3 0-1.3 2 0 2h9c1.3 0 1.3-2 0-2zM2 13c-1 0-1.3 1-.7 1.7.7.6 1.7.2 1.7-.712 0-.5-.4-1-1-1z"/></svg>
Before: 493 B
@@ -1,5 +0,0 @@
<!-- This Source Code Form is subject to the terms of the Mozilla Public
- License, v. 2.0. If a copy of the MPL was not distributed with this
- file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"
fill="rgba(255,255,255,1)"><g style="--darkreader-inline-fill:rgba(81, 82, 83, 0.8);" data-darkreader-inline-fill=""><rect x="1" y="1" width="6" height="6" rx="1" ry="1"></rect><rect x="9" y="1" width="6" height="6" rx="1" ry="1"></rect><rect x="1" y="9" width="6" height="6" rx="1" ry="1"></rect><rect x="9" y="9" width="6" height="6" rx="1" ry="1"></rect></g></svg>
Before: 662 B
@@ -1,5 +0,0 @@
<!-- This Source Code Form is subject to the terms of the Mozilla Public
- License, v. 2.0. If a copy of the MPL was not distributed with this
- file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"
fill="rgba(255,255,255,1)"><path d="M14 7H9V2a1 1 0 0 0-2 0v5H2a1 1 0 0 0 0 2h5v5a1 1 0 0 0 2 0V9h5a1 1 0 0 0 0-2z"></path></svg>
Before: 424 B
@@ -1,5 +0,0 @@
<!-- This Source Code Form is subject to the terms of the Mozilla Public
- License, v. 2.0. If a copy of the MPL was not distributed with this
- file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"
fill="rgba(255,255,255,1)"><rect x="2" y="7" width="12" height="2" rx="1"></rect></svg>
Before: 382 B
@@ -1,2 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16"
fill="rgba(255,255,255,1)"><path d="M13 9L6 5v8z"/></svg>
Before: 120 B
@@ -1,2 +0,0 @@
<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16"
fill="rgba(255,255,255,1)"><path d="M10 13l4-7H6z"/></svg>
Before: 121 B