mirror of https://github.com/janeczku/calibre-web (synced 2024-11-24 18:47:23 +00:00)

Merge remote-tracking branch 'upstream/master'

This commit is contained in: commit 2f69e3141e

.github/ISSUE_TEMPLATE/bug_report.md (vendored): 15 changed lines
@@ -6,6 +6,7 @@ labels: ''

assignees: ''

---

<!-- Please have a look at our [Contributing Guidelines](https://github.com/janeczku/calibre-web/blob/master/CONTRIBUTING.md) -->

**Describe the bug/problem**

A clear and concise description of what the bug is. If you are asking for support, please check our [Wiki](https://github.com/janeczku/calibre-web/wiki) if your question is already answered there.

@@ -27,12 +28,12 @@ A clear and concise description of what you expected to happen.

If applicable, add screenshots to help explain your problem.

**Environment (please complete the following information):**

- OS: [e.g. Windows 10/raspian]
- Python version [e.g. python2.7]
- Calibre-Web version [e.g. 0.6.5 or master@16.02.20, 19:55 ]:
- Docker container [ None/Technosoft2000/Linuxuser]:
- Special Hardware [e.g. Rasperry Pi Zero]
- Browser [e.g. chrome, safari]

- OS: [e.g. Windows 10/Raspberry Pi OS]
- Python version: [e.g. python2.7]
- Calibre-Web version: [e.g. 0.6.8 or 087c4c59 (git rev-parse --short HEAD)]:
- Docker container: [None/Technosoft2000/Linuxuser]:
- Special Hardware: [e.g. Rasperry Pi Zero]
- Browser: [e.g. Chrome 83.0.4103.97, Safari 13.3.7, Firefox 68.0.1 ESR]

**Additional context**

Add any other context about the problem here. [e.g. access via reverse proxy]
Add any other context about the problem here. [e.g. access via reverse proxy, database background sync, special database location]
.github/ISSUE_TEMPLATE/feature_request.md (vendored): 2 changed lines

@@ -7,6 +7,8 @@ assignees: ''

---

<!-- Please have a look at our [Contributing Guidelines](https://github.com/janeczku/calibre-web/blob/master/CONTRIBUTING.md) -->

**Is your feature request related to a problem? Please describe.**

A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
@@ -1,49 +1,46 @@

## How to contribute to Calibre-Web

First of all, we would like to thank you for reading this text. we are happy you are willing to contribute to Calibre-Web
First of all, we would like to thank you for reading this text. We are happy you are willing to contribute to Calibre-Web.

### **General**

**Communication language** is english. Google translated texts are not as bad as you might think, they are usually understandable, so don't worry if you generate your post that way.
**Communication language** is English. Google translated texts are not as bad as you might think, they are usually understandable, so don't worry if you generate your post that way.

**Calibre-Web** is not **Calibre**. If you are having a question regarding Calibre please post this at their [repository](https://github.com/kovidgoyal/calibre).

**Docker-Containers** of Calibre-Web are maintained by different persons than the people who drive Calibre-Web. If we come to the conclusion during our analysis that the problem is related to Docker, we might ask you to open a new issue at the reprository of the Docker Container.
**Docker-Containers** of Calibre-Web are maintained by different persons than the people who drive Calibre-Web. If we come to the conclusion during our analysis that the problem is related to Docker, we might ask you to open a new issue at the repository of the Docker Container.

If you are having **Basic Installation Problems** with python or it's dependencies, please consider using your favorite search engine to find a solution. In case you can't find a solution, we are happy to help you.
If you are having **Basic Installation Problems** with python or its dependencies, please consider using your favorite search engine to find a solution. In case you can't find a solution, we are happy to help you.

We can offer only very limited support regarding configuration of **Reverse-Proxy Installations**, **OPDS-Reader** or other programs in combination with Calibre-Web.
We can offer only very limited support regarding configuration of **Reverse-Proxy Installations**, **OPDS-Reader** or other programs in combination with Calibre-Web.

### **Translation**

Some of the user languages in Calibre-Web having missing translations. We are happy to add the missing texts if you translate them. Create a Pull Request, create an issue with the .po file attached, or write an email to "ozzie.fernandez.isaacs@googlemail.com" with attached translation file. To display all book languages in your native language an additional file is used (iso_language_names.py). The content of this file is autogenerated with the corresponding translations of Calibre, please do not edit this file on your own.
Some of the user languages in Calibre-Web having missing translations. We are happy to add the missing texts if you translate them. Create a Pull Request, create an issue with the .po file attached, or write an email to "ozzie.fernandez.isaacs@googlemail.com" with attached translation file. To display all book languages in your native language an additional file is used (iso_language_names.py). The content of this file is auto-generated with the corresponding translations of Calibre, please do not edit this file on your own.

### **Documentation**

The Calibre-Web documentation is hosted in the Github [Wiki](https://github.com/janeczku/calibre-web/wiki). The Wiki is open to everybody, if you find a problem, feel free to correct it. If information is missing, you are welcome to add it. The content will be reviewed time by time. Please try to be consitent with the form with the other Wiki pages (e.g. the project name is Calibre-Web with 2 capital letters and a dash in between).
The Calibre-Web documentation is hosted in the Github [Wiki](https://github.com/janeczku/calibre-web/wiki). The Wiki is open to everybody, if you find a problem, feel free to correct it. If information is missing, you are welcome to add it. The content will be reviewed time by time. Please try to be consistent with the form with the other Wiki pages (e.g. the project name is Calibre-Web with 2 capital letters and a dash in between).

### **Reporting a bug**

Do not open up a GitHub issue if the bug is a **security vulnerability** in Calibre-Web. Please write intead an email to "ozzie.fernandez.isaacs@googlemail.com".
Do not open up a GitHub issue if the bug is a **security vulnerability** in Calibre-Web. Instead, please write an email to "ozzie.fernandez.isaacs@googlemail.com".

Ensure the ***bug was not already reported** by searching on GitHub under [Issues](https://github.com/janeczku/calibre-web/issues). Please also check if a solution for your problem can be found in the [wiki](https://github.com/janeczku/calibre-web/wiki).

If you're unable to find an **open issue** addressing the problem, open a [new one](https://github.com/janeczku/calibre-web/issues/new?assignees=&labels=&template=bug_report.md&title=). Be sure to include a **title** and **clear description**, as much relevant information as possible, the **issue form** helps you providing the right information. Deleting the form and just pasting the stack trace doesn't speed up fixing the problem. If your issue could be resolved, consider closing the issue.
If you're unable to find an **open issue** addressing the problem, open a [new one](https://github.com/janeczku/calibre-web/issues/new?assignees=&labels=&template=bug_report.md&title=). Be sure to include a **title** and **clear description**, as much relevant information as possible, the **issue form** helps you providing the right information. Deleting the form and just pasting the stack trace doesn't speed up fixing the problem. If your issue could be resolved, consider closing the issue.

### **Feature Request**

If there is a feature missing in Calibre-Web and you can't find a feature request in the [Issues](https://github.com/janeczku/calibre-web/issues) section, you could create a [feature request](https://github.com/janeczku/calibre-web/issues/new?assignees=&labels=&template=feature_request.md&title=).
We will not extend Calibre-Web with any more login abilitys or add further files storages, or file syncing ability. Furthermore Calibre-Web is made for home usage for company inhouse usage, so requests regarding any sorts of social interaction capability, payment routines, search engine or web site analytics integration will not be implemeted.
If there is a feature missing in Calibre-Web and you can't find a feature request in the [Issues](https://github.com/janeczku/calibre-web/issues) section, you could create a [feature request](https://github.com/janeczku/calibre-web/issues/new?assignees=&labels=&template=feature_request.md&title=).
We will not extend Calibre-Web with any more login abilities or add further files storages, or file syncing ability. Furthermore Calibre-Web is made for home usage for company in-house usage, so requests regarding any sorts of social interaction capability, payment routines, search engine or web site analytics integration will not be implemented.

### **Contributing code to Calibre-Web**

Open a new GitHub pull request with the patch. Ensure the PR description clearly describes the problem and solution. Include the relevant issue number if applicable.

In case your code enhances features of Calibre-Web: Create your pull request for the development branch if your enhancement consits of more than some lines of code in a local section of Calibre-Webs code. This makes it easier to test it and check all implication before it's made public.

Please check if your code runs on Python 2.7 (still necessary in 2020) and mainly on python 3. If possible and the feature is related to operating system functions, try to check it on Windows and Linux.
Calibre-Web is automatically tested on Linux in combination with python 3.7. The code for testing is in a [seperate repo](https://github.com/OzzieIsaacs/calibre-web-test) on Github. It uses unittest and performs real system tests with selenium, would be great if you could consider also writing some tests.
A static code analysis is done by Codacy, but it's partly broken and doesn't run automatically. You could check your code with ESLint before contributing, a configuration file can be found in the projects root folder.

In case your code enhances features of Calibre-Web: Create your pull request for the development branch if your enhancement consists of more than some lines of code in a local section of Calibre-Webs code. This makes it easier to test it and check all implication before it's made public.

Please check if your code runs on Python 2.7 (still necessary in 2020) and mainly on python 3. If possible and the feature is related to operating system functions, try to check it on Windows and Linux.
Calibre-Web is automatically tested on Linux in combination with python 3.7. The code for testing is in a [separate repo](https://github.com/OzzieIsaacs/calibre-web-test) on Github. It uses unit tests and performs real system tests with selenium; it would be great if you could consider also writing some tests.
A static code analysis is done by Codacy, but it's partly broken and doesn't run automatically. You could check your code with ESLint before contributing, a configuration file can be found in the projects root folder.
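As a rough illustration of the kind of test the paragraph above asks contributors to write, here is a minimal, generic unittest sketch; the names are made up and the real calibre-web-test suite is organized differently (it also drives a browser through Selenium):

```python
import unittest


class LibraryReachableTest(unittest.TestCase):
    """Hypothetical example: the real suite starts a Calibre-Web server and a Selenium browser."""

    def setUp(self):
        # Stand-in for the fixture that would launch the application under test.
        self.base_url = "http://localhost:8083"

    def test_base_url_is_configured(self):
        # Placeholder assertion; a real test would exercise the web UI or OPDS feed.
        self.assertTrue(self.base_url.startswith("http://"))


if __name__ == "__main__":
    unittest.main()
```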
@@ -61,14 +61,14 @@ Optionally, to enable on-the-fly conversion from one ebook format to another whe

Pre-built Docker images are available in these Docker Hub repositories:

#### **Technosoft2000 - x64**
+ Docker Hub - [https://hub.docker.com/r/technosoft2000/calibre-web/](https://hub.docker.com/r/technosoft2000/calibre-web/)
+ Docker Hub - [https://hub.docker.com/r/technosoft2000/calibre-web](https://hub.docker.com/r/technosoft2000/calibre-web)
+ Github - [https://github.com/Technosoft2000/docker-calibre-web](https://github.com/Technosoft2000/docker-calibre-web)

Includes the Calibre `ebook-convert` binary.
+ The "path to convertertool" should be set to `/opt/calibre/ebook-convert`

#### **LinuxServer - x64, armhf, aarch64**
+ Docker Hub - [https://hub.docker.com/r/linuxserver/calibre-web/](https://hub.docker.com/r/linuxserver/calibre-web/)
+ Docker Hub - [https://hub.docker.com/r/linuxserver/calibre-web](https://hub.docker.com/r/linuxserver/calibre-web)
+ Github - [https://github.com/linuxserver/docker-calibre-web](https://github.com/linuxserver/docker-calibre-web)
+ Github - (Optional Calibre layer) - [https://github.com/linuxserver/docker-calibre-web/tree/calibre](https://github.com/linuxserver/docker-calibre-web/tree/calibre)

@@ -83,3 +83,7 @@ Pre-built Docker images are available in these Docker Hub repositories:

# Wiki

For further information, How To's and FAQ please check the [Wiki](https://github.com/janeczku/calibre-web/wiki)

# Contributing to Calibre-Web

Please have a look at our [Contributing Guidelines](https://github.com/janeczku/calibre-web/blob/master/CONTRIBUTING.md)
cps/admin.py: 152 changed lines
@@ -34,7 +34,7 @@ from flask import Blueprint, flash, redirect, url_for, abort, request, make_resp

from flask_login import login_required, current_user, logout_user
from flask_babel import gettext as _
from sqlalchemy import and_
from sqlalchemy.exc import IntegrityError
from sqlalchemy.exc import IntegrityError, OperationalError, InvalidRequestError
from sqlalchemy.sql.expression import func

from . import constants, logger, helper, services

@@ -99,7 +99,7 @@ def shutdown():

if task == 2:
log.warning("reconnecting to calibre database")
calibre_db.setup_db(config, ub.app_DB_path)
calibre_db.reconnect_db(config, ub.app_DB_path)
showtext['text'] = _(u'Reconnect successful')
return json.dumps(showtext)

@@ -132,6 +132,7 @@ def admin():

allUser = ub.session.query(ub.User).all()
email_settings = config.get_mail_settings()
return render_title_template("admin.html", allUser=allUser, email=email_settings, config=config, commit=commit,
feature_support=feature_support,
title=_(u"Admin page"), page="admin")
@@ -603,7 +604,7 @@ def _configuration_ldap_helper(to_save, gdriveError):

return reboot_required, _configuration_result(_('LDAP User Object Filter Has Unmatched Parenthesis'),
gdriveError)

if config.config_ldap_cert_path and not os.path.isdir(config.config_ldap_cert_path):
if config.config_ldap_cert_path and not os.path.isfile(config.config_ldap_cert_path):
return reboot_required, _configuration_result(_('LDAP Certificate Location is not Valid, Please Enter Correct Path'),
gdriveError)
return reboot_required, None
@@ -613,80 +614,90 @@ def _configuration_update_helper():
|
||||
reboot_required = False
|
||||
db_change = False
|
||||
to_save = request.form.to_dict()
|
||||
gdriveError = None
|
||||
|
||||
to_save['config_calibre_dir'] = re.sub('[\\/]metadata\.db$', '', to_save['config_calibre_dir'], flags=re.IGNORECASE)
|
||||
db_change |= _config_string(to_save, "config_calibre_dir")
|
||||
try:
|
||||
db_change |= _config_string(to_save, "config_calibre_dir")
|
||||
|
||||
# Google drive setup
|
||||
gdriveError = _configuration_gdrive_helper(to_save)
|
||||
# Google drive setup
|
||||
gdriveError = _configuration_gdrive_helper(to_save)
|
||||
|
||||
reboot_required |= _config_int(to_save, "config_port")
|
||||
reboot_required |= _config_int(to_save, "config_port")
|
||||
|
||||
reboot_required |= _config_string(to_save, "config_keyfile")
|
||||
if config.config_keyfile and not os.path.isfile(config.config_keyfile):
|
||||
return _configuration_result(_('Keyfile Location is not Valid, Please Enter Correct Path'), gdriveError)
|
||||
reboot_required |= _config_string(to_save, "config_keyfile")
|
||||
if config.config_keyfile and not os.path.isfile(config.config_keyfile):
|
||||
return _configuration_result(_('Keyfile Location is not Valid, Please Enter Correct Path'), gdriveError)
|
||||
|
||||
reboot_required |= _config_string(to_save, "config_certfile")
|
||||
if config.config_certfile and not os.path.isfile(config.config_certfile):
|
||||
return _configuration_result(_('Certfile Location is not Valid, Please Enter Correct Path'), gdriveError)
|
||||
reboot_required |= _config_string(to_save, "config_certfile")
|
||||
if config.config_certfile and not os.path.isfile(config.config_certfile):
|
||||
return _configuration_result(_('Certfile Location is not Valid, Please Enter Correct Path'), gdriveError)
|
||||
|
||||
_config_checkbox_int(to_save, "config_uploading")
|
||||
_config_checkbox_int(to_save, "config_anonbrowse")
|
||||
_config_checkbox_int(to_save, "config_public_reg")
|
||||
_config_checkbox_int(to_save, "config_register_email")
|
||||
reboot_required |= _config_checkbox_int(to_save, "config_kobo_sync")
|
||||
_config_checkbox_int(to_save, "config_kobo_proxy")
|
||||
_config_checkbox_int(to_save, "config_uploading")
|
||||
_config_checkbox_int(to_save, "config_anonbrowse")
|
||||
_config_checkbox_int(to_save, "config_public_reg")
|
||||
_config_checkbox_int(to_save, "config_register_email")
|
||||
reboot_required |= _config_checkbox_int(to_save, "config_kobo_sync")
|
||||
_config_int(to_save, "config_external_port")
|
||||
_config_checkbox_int(to_save, "config_kobo_proxy")
|
||||
|
||||
_config_string(to_save, "config_upload_formats")
|
||||
constants.EXTENSIONS_UPLOAD = [x.lstrip().rstrip() for x in config.config_upload_formats.split(',')]
|
||||
if "config_upload_formats" in to_save:
|
||||
to_save["config_upload_formats"] = ','.join(
|
||||
helper.uniq([x.lstrip().rstrip().lower() for x in to_save["config_upload_formats"].split(',')]))
|
||||
_config_string(to_save, "config_upload_formats")
|
||||
constants.EXTENSIONS_UPLOAD = config.config_upload_formats.split(',')
|
||||
|
||||
_config_string(to_save, "config_calibre")
|
||||
_config_string(to_save, "config_converterpath")
|
||||
_config_string(to_save, "config_kepubifypath")
|
||||
_config_string(to_save, "config_calibre")
|
||||
_config_string(to_save, "config_converterpath")
|
||||
_config_string(to_save, "config_kepubifypath")
|
||||
|
||||
reboot_required |= _config_int(to_save, "config_login_type")
|
||||
reboot_required |= _config_int(to_save, "config_login_type")
|
||||
|
||||
#LDAP configurator,
|
||||
if config.config_login_type == constants.LOGIN_LDAP:
|
||||
reboot, message = _configuration_ldap_helper(to_save, gdriveError)
|
||||
#LDAP configurator,
|
||||
if config.config_login_type == constants.LOGIN_LDAP:
|
||||
reboot, message = _configuration_ldap_helper(to_save, gdriveError)
|
||||
if message:
|
||||
return message
|
||||
reboot_required |= reboot
|
||||
|
||||
# Remote login configuration
|
||||
|
||||
_config_checkbox(to_save, "config_remote_login")
|
||||
if not config.config_remote_login:
|
||||
ub.session.query(ub.RemoteAuthToken).filter(ub.RemoteAuthToken.token_type==0).delete()
|
||||
|
||||
# Goodreads configuration
|
||||
_config_checkbox(to_save, "config_use_goodreads")
|
||||
_config_string(to_save, "config_goodreads_api_key")
|
||||
_config_string(to_save, "config_goodreads_api_secret")
|
||||
if services.goodreads_support:
|
||||
services.goodreads_support.connect(config.config_goodreads_api_key,
|
||||
config.config_goodreads_api_secret,
|
||||
config.config_use_goodreads)
|
||||
|
||||
_config_int(to_save, "config_updatechannel")
|
||||
|
||||
# Reverse proxy login configuration
|
||||
_config_checkbox(to_save, "config_allow_reverse_proxy_header_login")
|
||||
_config_string(to_save, "config_reverse_proxy_login_header_name")
|
||||
|
||||
# OAuth configuration
|
||||
if config.config_login_type == constants.LOGIN_OAUTH:
|
||||
reboot_required |= _configuration_oauth_helper(to_save)
|
||||
|
||||
reboot, message = _configuration_logfile_helper(to_save, gdriveError)
|
||||
if message:
|
||||
return message
|
||||
reboot_required |= reboot
|
||||
|
||||
# Remote login configuration
|
||||
_config_checkbox(to_save, "config_remote_login")
|
||||
if not config.config_remote_login:
|
||||
ub.session.query(ub.RemoteAuthToken).filter(ub.RemoteAuthToken.token_type==0).delete()
|
||||
|
||||
# Goodreads configuration
|
||||
_config_checkbox(to_save, "config_use_goodreads")
|
||||
_config_string(to_save, "config_goodreads_api_key")
|
||||
_config_string(to_save, "config_goodreads_api_secret")
|
||||
if services.goodreads_support:
|
||||
services.goodreads_support.connect(config.config_goodreads_api_key,
|
||||
config.config_goodreads_api_secret,
|
||||
config.config_use_goodreads)
|
||||
|
||||
_config_int(to_save, "config_updatechannel")
|
||||
|
||||
# Reverse proxy login configuration
|
||||
_config_checkbox(to_save, "config_allow_reverse_proxy_header_login")
|
||||
_config_string(to_save, "config_reverse_proxy_login_header_name")
|
||||
|
||||
# OAuth configuration
|
||||
if config.config_login_type == constants.LOGIN_OAUTH:
|
||||
reboot_required |= _configuration_oauth_helper(to_save)
|
||||
|
||||
reboot, message = _configuration_logfile_helper(to_save, gdriveError)
|
||||
if message:
|
||||
return message
|
||||
reboot_required |= reboot
|
||||
# Rarfile Content configuration
|
||||
_config_string(to_save, "config_rarfile_location")
|
||||
if "config_rarfile_location" in to_save:
|
||||
unrar_status = helper.check_unrar(config.config_rarfile_location)
|
||||
if unrar_status:
|
||||
return _configuration_result(unrar_status, gdriveError)
|
||||
# Rarfile Content configuration
|
||||
_config_string(to_save, "config_rarfile_location")
|
||||
if "config_rarfile_location" in to_save:
|
||||
unrar_status = helper.check_unrar(config.config_rarfile_location)
|
||||
if unrar_status:
|
||||
return _configuration_result(unrar_status, gdriveError)
|
||||
except (OperationalError, InvalidRequestError):
|
||||
ub.session.rollback()
|
||||
_configuration_result(_(u"Settings DB is not Writeable"), gdriveError)
|
||||
|
||||
try:
|
||||
metadata_db = os.path.join(config.config_calibre_dir, "metadata.db")
|
||||
@@ -719,7 +730,7 @@ def _configuration_result(error_flash=None, gdriveError=None):

gdriveError = _(gdriveError)
else:
# if config.config_use_google_drive and\
if not gdrive_authenticate:
if not gdrive_authenticate and gdrive_support:
gdrivefolders = gdriveutils.listRootFolders()

show_back_button = current_user.is_authenticated

@@ -783,6 +794,9 @@ def _handle_new_user(to_save, content,languages, translations, kobo_support):

except IntegrityError:
ub.session.rollback()
flash(_(u"Found an existing account for this e-mail address or nickname."), category="error")
except OperationalError:
ub.session.rollback()
flash(_(u"Settings DB is not Writeable"), category="error")

def _handle_edit_user(to_save, content,languages, translations, kobo_support, downloads):

@@ -872,6 +886,9 @@ def _handle_edit_user(to_save, content,languages, translations, kobo_support, do

except IntegrityError:
ub.session.rollback()
flash(_(u"An unknown error occured."), category="error")
except OperationalError:
ub.session.rollback()
flash(_(u"Settings DB is not Writeable"), category="error")

@admi.route("/admin/user/new", methods=["GET", "POST"])

@@ -916,7 +933,12 @@ def update_mailsettings():

_config_string(to_save, "mail_password")
_config_string(to_save, "mail_from")
_config_int(to_save, "mail_size", lambda y: int(y)*1024*1024)
config.save()
try:
config.save()
except (OperationalError, InvalidRequestError):
ub.session.rollback()
flash(_(u"Settings DB is not Writeable"), category="error")
return edit_mailsettings()

if to_save.get("test"):
if current_user.email:
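The recurring change in cps/admin.py is that config.save() and the user-handling commits are no longer called unguarded: on a read-only or locked settings database the session is rolled back and the user is notified, instead of the request crashing. A minimal sketch of that pattern with stand-in names (save_settings, session, notify_user are not the real Calibre-Web helpers):

```python
from sqlalchemy.exc import InvalidRequestError, OperationalError


def save_settings(config, session, notify_user):
    """Persist settings; report an unwritable settings DB instead of raising."""
    try:
        config.save()
    except (OperationalError, InvalidRequestError):
        # The settings DB could not be written (e.g. read-only file or locked DB):
        # undo the pending changes and tell the user.
        session.rollback()
        notify_user("Settings DB is not Writeable")
        return False
    return True
```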
cps/comic.py: 24 changed lines
@@ -74,10 +74,10 @@ def _cover_processing(tmp_file_name, img, extension):

def _extractCover(tmp_file_name, original_file_extension, rarExceutable):
def _extractCover(tmp_file_name, original_file_extension, rarExecutable):
cover_data = extension = None
if use_comic_meta:
archive = ComicArchive(tmp_file_name)
archive = ComicArchive(tmp_file_name, rar_exe_path=rarExecutable)
for index, name in enumerate(archive.getPageNameList()):
ext = os.path.splitext(name)
if len(ext) > 1:

@@ -106,7 +106,7 @@ def _extractCover(tmp_file_name, original_file_extension, rarExceutable):

break
elif original_file_extension.upper() == '.CBR' and use_rarfile:
try:
rarfile.UNRAR_TOOL = rarExceutable
rarfile.UNRAR_TOOL = rarExecutable
cf = rarfile.RarFile(tmp_file_name)
for name in cf.getnames():
ext = os.path.splitext(name)

@@ -120,9 +120,9 @@ def _extractCover(tmp_file_name, original_file_extension, rarExceutable):

return _cover_processing(tmp_file_name, cover_data, extension)

def get_comic_info(tmp_file_path, original_file_name, original_file_extension, rarExceutable):
def get_comic_info(tmp_file_path, original_file_name, original_file_extension, rarExecutable):
if use_comic_meta:
archive = ComicArchive(tmp_file_path, rar_exe_path=rarExceutable)
archive = ComicArchive(tmp_file_path, rar_exe_path=rarExecutable)
if archive.seemsToBeAComicArchive():
if archive.hasMetadata(MetaDataStyle.CIX):
style = MetaDataStyle.CIX

@@ -134,21 +134,15 @@ def get_comic_info(tmp_file_path, original_file_name, original_file_extension, r

# if style is not None:
loadedMetadata = archive.readMetadata(style)

lang = loadedMetadata.language
if lang:
if len(lang) == 2:
loadedMetadata.language = isoLanguages.get(part1=lang).name
elif len(lang) == 3:
loadedMetadata.language = isoLanguages.get(part3=lang).name
else:
loadedMetadata.language = ""
lang = loadedMetadata.language or ""
loadedMetadata.language = isoLanguages.get_lang3(lang)

return BookMeta(
file_path=tmp_file_path,
extension=original_file_extension,
title=loadedMetadata.title or original_file_name,
author=" & ".join([credit["person"] for credit in loadedMetadata.credits if credit["role"] == "Writer"]) or u'Unknown',
cover=_extractCover(tmp_file_path, original_file_extension, rarExceutable),
cover=_extractCover(tmp_file_path, original_file_extension, rarExecutable),
description=loadedMetadata.comments or "",
tags="",
series=loadedMetadata.series or "",

@@ -160,7 +154,7 @@ def get_comic_info(tmp_file_path, original_file_name, original_file_extension, r

extension=original_file_extension,
title=original_file_name,
author=u'Unknown',
cover=_extractCover(tmp_file_path, original_file_extension, rarExceutable),
cover=_extractCover(tmp_file_path, original_file_extension, rarExecutable),
description="",
tags="",
series="",
@@ -22,7 +22,7 @@ import os

import json
import sys

from sqlalchemy import exc, Column, String, Integer, SmallInteger, Boolean, BLOB
from sqlalchemy import exc, Column, String, Integer, SmallInteger, Boolean, BLOB, JSON
from sqlalchemy.ext.declarative import declarative_base

from . import constants, cli, logger, ub

@@ -57,6 +57,7 @@ class _Settings(_Base):

config_calibre_dir = Column(String)
config_port = Column(Integer, default=constants.DEFAULT_PORT)
config_external_port = Column(Integer, default=constants.DEFAULT_PORT)
config_certfile = Column(String)
config_keyfile = Column(String)

@@ -92,7 +93,7 @@ class _Settings(_Base):

config_use_google_drive = Column(Boolean, default=False)
config_google_drive_folder = Column(String)
config_google_drive_watch_changes_response = Column(String)
config_google_drive_watch_changes_response = Column(JSON, default={})

config_use_goodreads = Column(Boolean, default=False)
config_goodreads_api_key = Column(String)

@@ -102,7 +103,6 @@ class _Settings(_Base):

config_kobo_proxy = Column(Boolean, default=False)

config_ldap_provider_url = Column(String, default='example.org')
config_ldap_port = Column(SmallInteger, default=389)
config_ldap_authentication = Column(SmallInteger, default=constants.LDAP_AUTH_SIMPLE)

@@ -215,20 +215,20 @@ class _ConfigSQL(object):

return self.show_element_new_user(constants.DETAIL_RANDOM)

def list_denied_tags(self):
mct = self.config_denied_tags.split(",")
return [t.strip() for t in mct]
mct = self.config_denied_tags or ""
return [t.strip() for t in mct.split(",")]

def list_allowed_tags(self):
mct = self.config_allowed_tags.split(",")
return [t.strip() for t in mct]
mct = self.config_allowed_tags or ""
return [t.strip() for t in mct.split(",")]

def list_denied_column_values(self):
mct = self.config_denied_column_value.split(",")
return [t.strip() for t in mct]
mct = self.config_denied_column_value or ""
return [t.strip() for t in mct.split(",")]

def list_allowed_column_values(self):
mct = self.config_allowed_column_value.split(",")
return [t.strip() for t in mct]
mct = self.config_allowed_column_value or ""
return [t.strip() for t in mct.split(",")]

def get_log_level(self):
return logger.get_level_name(self.config_log_level)
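All four list_* methods above now tolerate an unset (None) column by substituting an empty string before splitting. A standalone illustration of why, in plain Python rather than the _ConfigSQL class:

```python
def split_csv_setting(value):
    """Split a comma-separated setting into trimmed items, treating None as empty."""
    # value.split(",") would raise AttributeError while the column is still NULL/None,
    # so fall back to an empty string first (the change made above).
    return [item.strip() for item in (value or "").split(",")]


print(split_csv_setting("fantasy, horror"))  # ['fantasy', 'horror']
print(split_csv_setting(None))               # [''] -- no crash on an unset setting
```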
@@ -281,10 +281,6 @@ class _ConfigSQL(object):

v = column.default.arg
setattr(self, k, v)

if self.config_google_drive_watch_changes_response:
self.config_google_drive_watch_changes_response = \
json.loads(self.config_google_drive_watch_changes_response)

have_metadata_db = bool(self.config_calibre_dir)
if have_metadata_db:
if not self.config_use_google_drive:

@@ -303,10 +299,6 @@ class _ConfigSQL(object):

'''Apply all configuration values to the underlying storage.'''
s = self._read_from_storage() # type: _Settings

if self.config_google_drive_watch_changes_response:
self.config_google_drive_watch_changes_response = json.dumps(
self.config_google_drive_watch_changes_response)

for k, v in self.__dict__.items():
if k[0] == '_':
continue

@@ -361,10 +353,10 @@ def _migrate_table(session, orm_class):

def autodetect_calibre_binary():
if sys.platform == "win32":
calibre_path = ["C:\\program files\calibre\ebook-convert.exe",
"C:\\program files(x86)\calibre\ebook-convert.exe",
"C:\\program files(x86)\calibre2\ebook-convert.exe",
"C:\\program files\calibre2\ebook-convert.exe"]
calibre_path = ["C:\\program files\\calibre\\ebook-convert.exe",
"C:\\program files(x86)\\calibre\\ebook-convert.exe",
"C:\\program files(x86)\\calibre2\\ebook-convert.exe",
"C:\\program files\\calibre2\\ebook-convert.exe"]
else:
calibre_path = ["/opt/calibre/ebook-convert"]
for element in calibre_path:

@@ -128,7 +128,7 @@ def selected_roles(dictionary):

BookMeta = namedtuple('BookMeta', 'file_path, extension, title, author, cover, description, tags, series, '
'series_id, languages')

STABLE_VERSION = {'version': '0.6.9 Beta'}
STABLE_VERSION = {'version': '0.6.10 Beta'}

NIGHTLY_VERSION = {}
NIGHTLY_VERSION[0] = '$Format:%H$'
cps/db.py: 136 changed lines
@@ -32,8 +32,10 @@ from sqlalchemy import String, Integer, Boolean, TIMESTAMP, Float

from sqlalchemy.orm import relationship, sessionmaker, scoped_session
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.exc import OperationalError
from sqlalchemy.pool import StaticPool
from flask_login import current_user
from sqlalchemy.sql.expression import and_, true, false, text, func, or_
from sqlalchemy.ext.associationproxy import association_proxy
from babel import Locale as LC
from babel.core import UnknownLocaleError
from flask_babel import gettext as _

@@ -98,41 +100,61 @@ class Identifiers(Base):

self.book = book

def formatType(self):
if self.type == "amazon":
format_type = self.type.lower()
if format_type == 'amazon':
return u"Amazon"
elif self.type == "isbn":
elif format_type.startswith("amazon_"):
return u"Amazon.{0}".format(format_type[7:])
elif format_type == "isbn":
return u"ISBN"
elif self.type == "doi":
elif format_type == "doi":
return u"DOI"
elif self.type == "goodreads":
elif format_type == "douban":
return u"Douban"
elif format_type == "goodreads":
return u"Goodreads"
elif self.type == "google":
elif format_type == "google":
return u"Google Books"
elif self.type == "kobo":
elif format_type == "kobo":
return u"Kobo"
if self.type == "lubimyczytac":
elif format_type == "litres":
return u"ЛитРес"
elif format_type == "issn":
return u"ISSN"
elif format_type == "isfdb":
return u"ISFDB"
if format_type == "lubimyczytac":
return u"Lubimyczytac"
else:
return self.type

def __repr__(self):
if self.type == "amazon" or self.type == "asin":
return u"https://amzn.com/{0}".format(self.val)
elif self.type == "isbn":
format_type = self.type.lower()
if format_type == "amazon" or format_type == "asin":
return u"https://amazon.com/dp/{0}".format(self.val)
elif format_type.startswith('amazon_'):
return u"https://amazon.{0}/dp/{1}".format(format_type[7:], self.val)
elif format_type == "isbn":
return u"https://www.worldcat.org/isbn/{0}".format(self.val)
elif self.type == "doi":
elif format_type == "doi":
return u"https://dx.doi.org/{0}".format(self.val)
elif self.type == "goodreads":
elif format_type == "goodreads":
return u"https://www.goodreads.com/book/show/{0}".format(self.val)
elif self.type == "douban":
elif format_type == "douban":
return u"https://book.douban.com/subject/{0}".format(self.val)
elif self.type == "google":
elif format_type == "google":
return u"https://books.google.com/books?id={0}".format(self.val)
elif self.type == "kobo":
elif format_type == "kobo":
return u"https://www.kobo.com/ebook/{0}".format(self.val)
elif self.type == "lubimyczytac":
return u" https://lubimyczytac.pl/ksiazka/{0}".format(self.val)
elif self.type == "url":
elif format_type == "lubimyczytac":
return u" https://lubimyczytac.pl/ksiazka/{0}/ksiazka".format(self.val)
elif format_type == "litres":
return u"https://www.litres.ru/{0}".format(self.val)
elif format_type == "issn":
return u"https://portal.issn.org/resource/ISSN/{0}".format(self.val)
elif format_type == "isfdb":
return u"http://www.isfdb.org/cgi-bin/pl.cgi?{0}".format(self.val)
elif format_type == "url":
return u"{0}".format(self.val)
else:
return u""
@@ -280,7 +302,7 @@ class Books(Base):

flags = Column(Integer, nullable=False, default=1)

authors = relationship('Authors', secondary=books_authors_link, backref='books')
tags = relationship('Tags', secondary=books_tags_link, backref='books',order_by="Tags.name")
tags = relationship('Tags', secondary=books_tags_link, backref='books', order_by="Tags.name")
comments = relationship('Comments', backref='books')
data = relationship('Data', backref='books')
series = relationship('Series', secondary=books_series_link, backref='books')

@@ -370,7 +392,6 @@ class CalibreDB(threading.Thread):

def setup_db(self, config, app_db_path):
self.config = config
self.dispose()
# global engine

if not config.config_calibre_dir:
config.invalidate()

@@ -382,11 +403,11 @@ class CalibreDB(threading.Thread):

return False

try:
#engine = create_engine('sqlite:///{0}'.format(dbpath),
self.engine = create_engine('sqlite://',
echo=False,
isolation_level="SERIALIZABLE",
connect_args={'check_same_thread': False})
connect_args={'check_same_thread': False},
poolclass=StaticPool)
self.engine.execute("attach database '{}' as calibre;".format(dbpath))
self.engine.execute("attach database '{}' as app_settings;".format(app_db_path))
|
||||
|
||||
@ -406,34 +427,46 @@ class CalibreDB(threading.Thread):
|
||||
books_custom_column_links = {}
|
||||
for row in cc:
|
||||
if row.datatype not in cc_exceptions:
|
||||
books_custom_column_links[row.id] = Table('books_custom_column_' + str(row.id) + '_link', Base.metadata,
|
||||
Column('book', Integer, ForeignKey('books.id'),
|
||||
primary_key=True),
|
||||
Column('value', Integer,
|
||||
ForeignKey('custom_column_' + str(row.id) + '.id'),
|
||||
primary_key=True)
|
||||
)
|
||||
cc_ids.append([row.id, row.datatype])
|
||||
if row.datatype == 'bool':
|
||||
ccdict = {'__tablename__': 'custom_column_' + str(row.id),
|
||||
'id': Column(Integer, primary_key=True),
|
||||
'book': Column(Integer, ForeignKey('books.id')),
|
||||
'value': Column(Boolean)}
|
||||
elif row.datatype == 'int':
|
||||
ccdict = {'__tablename__': 'custom_column_' + str(row.id),
|
||||
'id': Column(Integer, primary_key=True),
|
||||
'book': Column(Integer, ForeignKey('books.id')),
|
||||
'value': Column(Integer)}
|
||||
elif row.datatype == 'float':
|
||||
ccdict = {'__tablename__': 'custom_column_' + str(row.id),
|
||||
'id': Column(Integer, primary_key=True),
|
||||
'book': Column(Integer, ForeignKey('books.id')),
|
||||
'value': Column(Float)}
|
||||
if row.datatype == 'series':
|
||||
dicttable = {'__tablename__': 'books_custom_column_' + str(row.id) + '_link',
|
||||
'id': Column(Integer, primary_key=True),
|
||||
'book': Column(Integer, ForeignKey('books.id'),
|
||||
primary_key=True),
|
||||
'map_value': Column('value', Integer,
|
||||
ForeignKey('custom_column_' +
|
||||
str(row.id) + '.id'),
|
||||
primary_key=True),
|
||||
'extra': Column(Float),
|
||||
'asoc' : relationship('custom_column_' + str(row.id), uselist=False),
|
||||
'value' : association_proxy('asoc', 'value')
|
||||
}
|
||||
books_custom_column_links[row.id] = type(str('books_custom_column_' + str(row.id) + '_link'),
|
||||
(Base,), dicttable)
|
||||
else:
|
||||
ccdict = {'__tablename__': 'custom_column_' + str(row.id),
|
||||
'id': Column(Integer, primary_key=True),
|
||||
'value': Column(String)}
|
||||
cc_classes[row.id] = type(str('Custom_Column_' + str(row.id)), (Base,), ccdict)
|
||||
books_custom_column_links[row.id] = Table('books_custom_column_' + str(row.id) + '_link',
|
||||
Base.metadata,
|
||||
Column('book', Integer, ForeignKey('books.id'),
|
||||
primary_key=True),
|
||||
Column('value', Integer,
|
||||
ForeignKey('custom_column_' +
|
||||
str(row.id) + '.id'),
|
||||
primary_key=True)
|
||||
)
|
||||
cc_ids.append([row.id, row.datatype])
|
||||
|
||||
ccdict = {'__tablename__': 'custom_column_' + str(row.id),
|
||||
'id': Column(Integer, primary_key=True)}
|
||||
if row.datatype == 'float':
|
||||
ccdict['value'] = Column(Float)
|
||||
elif row.datatype == 'int':
|
||||
ccdict['value'] = Column(Integer)
|
||||
elif row.datatype == 'bool':
|
||||
ccdict['value'] = Column(Boolean)
|
||||
else:
|
||||
ccdict['value'] = Column(String)
|
||||
if row.datatype in ['float', 'int', 'bool']:
|
||||
ccdict['book'] = Column(Integer, ForeignKey('books.id'))
|
||||
cc_classes[row.id] = type(str('custom_column_' + str(row.id)), (Base,), ccdict)
|
||||
|
||||
for cc_id in cc_ids:
|
||||
if (cc_id[1] == 'bool') or (cc_id[1] == 'int') or (cc_id[1] == 'float'):
|
||||
@ -443,6 +476,11 @@ class CalibreDB(threading.Thread):
|
||||
primaryjoin=(
|
||||
Books.id == cc_classes[cc_id[0]].book),
|
||||
backref='books'))
|
||||
elif (cc_id[1] == 'series'):
|
||||
setattr(Books,
|
||||
'custom_column_' + str(cc_id[0]),
|
||||
relationship(books_custom_column_links[cc_id[0]],
|
||||
backref='books'))
|
||||
else:
|
||||
setattr(Books,
|
||||
'custom_column_' + str(cc_id[0]),
|
||||
|
cps/editbooks.py: 149 changed lines
@@ -151,8 +151,11 @@ def modify_identifiers(input_identifiers, db_identifiers, db_session):

input_identifiers is a list of read-to-persist Identifiers objects.
db_identifiers is a list of already persisted list of Identifiers objects."""
changed = False
input_dict = dict([ (identifier.type.lower(), identifier) for identifier in input_identifiers ])
db_dict = dict([ (identifier.type.lower(), identifier) for identifier in db_identifiers ])
error = False
input_dict = dict([(identifier.type.lower(), identifier) for identifier in input_identifiers])
if len(input_identifiers) != len(input_dict):
error = True
db_dict = dict([(identifier.type.lower(), identifier) for identifier in db_identifiers ])
# delete db identifiers not present in input or modify them with input val
for identifier_type, identifier in db_dict.items():
if identifier_type not in input_dict.keys():

@@ -167,7 +170,7 @@ def modify_identifiers(input_identifiers, db_identifiers, db_session):

if identifier_type not in db_dict.keys():
db_session.add(identifier)
changed = True
return changed
return changed, error

@editbook.route("/delete/<int:book_id>/", defaults={'book_format': ""})
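The new error flag relies on the fact that a dict keyed on identifier.type.lower() collapses entries that differ only in case, so a length mismatch means the submitted identifiers contained such duplicates. A quick standalone check of that idea:

```python
ids = ["ISBN", "isbn", "goodreads"]           # two entries differ only in case
unique = {t.lower(): t for t in ids}          # the later entry overwrites the earlier one
has_case_duplicates = len(ids) != len(unique)
print(has_case_duplicates)                    # True -> warn "Identifiers are not Case Sensitive"
```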
@@ -354,7 +357,10 @@ def edit_book_comments(comments, book):

def edit_book_languages(languages, book, upload=False):
input_languages = languages.split(',')
unknown_languages = []
input_l = isoLanguages.get_language_codes(get_locale(), input_languages, unknown_languages)
if not upload:
input_l = isoLanguages.get_language_codes(get_locale(), input_languages, unknown_languages)
else:
input_l = isoLanguages.get_valid_language_codes(get_locale(), input_languages, unknown_languages)
for l in unknown_languages:
log.error('%s is not a valid language', l)
flash(_(u"%(langname)s is not a valid language", langname=l), category="warning")

@@ -375,7 +381,8 @@ def edit_book_publisher(to_save, book):

if to_save["publisher"]:
publisher = to_save["publisher"].rstrip().strip()
if len(book.publishers) == 0 or (len(book.publishers) > 0 and publisher != book.publishers[0].name):
changed |= modify_database_object([publisher], book.publishers, db.Publishers, calibre_db.session, 'publisher')
changed |= modify_database_object([publisher], book.publishers, db.Publishers, calibre_db.session,
'publisher')
elif len(book.publishers):
changed |= modify_database_object([], book.publishers, db.Publishers, calibre_db.session, 'publisher')
return changed
@ -459,62 +466,64 @@ def edit_cc_data(book_id, book, to_save):
|
||||
def upload_single_file(request, book, book_id):
|
||||
# Check and handle Uploaded file
|
||||
if 'btn-upload-format' in request.files:
|
||||
requested_file = request.files['btn-upload-format']
|
||||
# check for empty request
|
||||
if requested_file.filename != '':
|
||||
if '.' in requested_file.filename:
|
||||
file_ext = requested_file.filename.rsplit('.', 1)[-1].lower()
|
||||
if file_ext not in constants.EXTENSIONS_UPLOAD:
|
||||
flash(_("File extension '%(ext)s' is not allowed to be uploaded to this server", ext=file_ext),
|
||||
category="error")
|
||||
requested_file = request.files['btn-upload-format']
|
||||
# check for empty request
|
||||
if requested_file.filename != '':
|
||||
if not current_user.role_upload():
|
||||
abort(403)
|
||||
if '.' in requested_file.filename:
|
||||
file_ext = requested_file.filename.rsplit('.', 1)[-1].lower()
|
||||
if file_ext not in constants.EXTENSIONS_UPLOAD and '' not in constants.EXTENSIONS_UPLOAD:
|
||||
flash(_("File extension '%(ext)s' is not allowed to be uploaded to this server", ext=file_ext),
|
||||
category="error")
|
||||
return redirect(url_for('web.show_book', book_id=book.id))
|
||||
else:
|
||||
flash(_('File to be uploaded must have an extension'), category="error")
|
||||
return redirect(url_for('web.show_book', book_id=book.id))
|
||||
else:
|
||||
flash(_('File to be uploaded must have an extension'), category="error")
|
||||
return redirect(url_for('web.show_book', book_id=book.id))
|
||||
|
||||
file_name = book.path.rsplit('/', 1)[-1]
|
||||
filepath = os.path.normpath(os.path.join(config.config_calibre_dir, book.path))
|
||||
saved_filename = os.path.join(filepath, file_name + '.' + file_ext)
|
||||
file_name = book.path.rsplit('/', 1)[-1]
|
||||
filepath = os.path.normpath(os.path.join(config.config_calibre_dir, book.path))
|
||||
saved_filename = os.path.join(filepath, file_name + '.' + file_ext)
|
||||
|
||||
# check if file path exists, otherwise create it, copy file to calibre path and delete temp file
|
||||
if not os.path.exists(filepath):
|
||||
# check if file path exists, otherwise create it, copy file to calibre path and delete temp file
|
||||
if not os.path.exists(filepath):
|
||||
try:
|
||||
os.makedirs(filepath)
|
||||
except OSError:
|
||||
flash(_(u"Failed to create path %(path)s (Permission denied).", path=filepath), category="error")
|
||||
return redirect(url_for('web.show_book', book_id=book.id))
|
||||
try:
|
||||
os.makedirs(filepath)
|
||||
requested_file.save(saved_filename)
|
||||
except OSError:
|
||||
flash(_(u"Failed to create path %(path)s (Permission denied).", path=filepath), category="error")
|
||||
return redirect(url_for('web.show_book', book_id=book.id))
|
||||
try:
|
||||
requested_file.save(saved_filename)
|
||||
except OSError:
|
||||
flash(_(u"Failed to store file %(file)s.", file=saved_filename), category="error")
|
||||
return redirect(url_for('web.show_book', book_id=book.id))
|
||||
|
||||
file_size = os.path.getsize(saved_filename)
|
||||
is_format = calibre_db.get_book_format(book_id, file_ext.upper())
|
||||
|
||||
# Format entry already exists, no need to update the database
|
||||
if is_format:
|
||||
log.warning('Book format %s already existing', file_ext.upper())
|
||||
else:
|
||||
try:
|
||||
db_format = db.Data(book_id, file_ext.upper(), file_size, file_name)
|
||||
calibre_db.session.add(db_format)
|
||||
calibre_db.session.commit()
|
||||
calibre_db.update_title_sort(config)
|
||||
except OperationalError as e:
|
||||
calibre_db.session.rollback()
|
||||
log.error('Database error: %s', e)
|
||||
flash(_(u"Database error: %(error)s.", error=e), category="error")
|
||||
flash(_(u"Failed to store file %(file)s.", file=saved_filename), category="error")
|
||||
return redirect(url_for('web.show_book', book_id=book.id))
|
||||
|
||||
# Queue uploader info
|
||||
uploadText=_(u"File format %(ext)s added to %(book)s", ext=file_ext.upper(), book=book.title)
|
||||
worker.add_upload(current_user.nickname,
|
||||
"<a href=\"" + url_for('web.show_book', book_id=book.id) + "\">" + uploadText + "</a>")
|
||||
file_size = os.path.getsize(saved_filename)
|
||||
is_format = calibre_db.get_book_format(book_id, file_ext.upper())
|
||||
|
||||
return uploader.process(
|
||||
saved_filename, *os.path.splitext(requested_file.filename),
|
||||
rarExecutable=config.config_rarfile_location)
|
||||
# Format entry already exists, no need to update the database
|
||||
if is_format:
|
||||
log.warning('Book format %s already existing', file_ext.upper())
|
||||
else:
|
||||
try:
|
||||
db_format = db.Data(book_id, file_ext.upper(), file_size, file_name)
|
||||
calibre_db.session.add(db_format)
|
||||
calibre_db.session.commit()
|
||||
calibre_db.update_title_sort(config)
|
||||
except OperationalError as e:
|
||||
calibre_db.session.rollback()
|
||||
log.error('Database error: %s', e)
|
||||
flash(_(u"Database error: %(error)s.", error=e), category="error")
|
||||
return redirect(url_for('web.show_book', book_id=book.id))
|
||||
|
||||
# Queue uploader info
|
||||
uploadText=_(u"File format %(ext)s added to %(book)s", ext=file_ext.upper(), book=book.title)
|
||||
worker.add_upload(current_user.nickname,
|
||||
"<a href=\"" + url_for('web.show_book', book_id=book.id) + "\">" + uploadText + "</a>")
|
||||
|
||||
return uploader.process(
|
||||
saved_filename, *os.path.splitext(requested_file.filename),
|
||||
rarExecutable=config.config_rarfile_location)
|
||||
|
||||
|
||||
def upload_cover(request, book):
|
||||
@ -522,6 +531,8 @@ def upload_cover(request, book):
|
||||
requested_file = request.files['btn-upload-cover']
|
||||
# check for empty request
|
||||
if requested_file.filename != '':
|
||||
if not current_user.role_upload():
|
||||
abort(403)
|
||||
ret, message = helper.save_cover(requested_file, book.path)
|
||||
if ret is True:
|
||||
return True
|
||||
@ -601,13 +612,16 @@ def edit_book(book_id):
|
||||
error = helper.update_dir_stucture(edited_books_id, config.config_calibre_dir, input_authors[0])
|
||||
|
||||
if not error:
|
||||
if to_save["cover_url"]:
|
||||
result, error = helper.save_cover_from_url(to_save["cover_url"], book.path)
|
||||
if result is True:
|
||||
book.has_cover = 1
|
||||
modif_date = True
|
||||
else:
|
||||
flash(error, category="error")
|
||||
if "cover_url" in to_save:
|
||||
if to_save["cover_url"]:
|
||||
if not current_user.role_upload():
|
||||
return "", (403)
|
||||
result, error = helper.save_cover_from_url(to_save["cover_url"], book.path)
|
||||
if result is True:
|
||||
book.has_cover = 1
|
||||
modif_date = True
|
||||
else:
|
||||
flash(error, category="error")
|
||||
|
||||
# Add default series_index to book
|
||||
modif_date |= edit_book_series_index(to_save["series_index"], book)
|
||||
@ -615,10 +629,12 @@ def edit_book(book_id):
|
||||
# Handle book comments/description
|
||||
modif_date |= edit_book_comments(to_save["description"], book)
|
||||
|
||||
# Handle identifiers
|
||||
# Handle identifiers
|
||||
input_identifiers = identifier_list(to_save, book)
|
||||
modif_date |= modify_identifiers(input_identifiers, book.identifiers, calibre_db.session)
|
||||
|
||||
modification, warning = modify_identifiers(input_identifiers, book.identifiers, calibre_db.session)
|
||||
if warning:
|
||||
flash(_("Identifiers are not Case Sensitive, Overwriting Old Identifier"), category="warning")
|
||||
modif_date |= modification
|
||||
# Handle book tags
|
||||
modif_date |= edit_book_tags(to_save['tags'], book)
|
||||
|
||||
@ -647,6 +663,7 @@ def edit_book(book_id):
|
||||
|
||||
if modif_date:
|
||||
book.last_modified = datetime.utcnow()
|
||||
calibre_db.session.merge(book)
|
||||
calibre_db.session.commit()
|
||||
if config.config_use_google_drive:
|
||||
gdriveutils.updateGdriveCalibreFromLocal()
|
||||
@ -690,7 +707,7 @@ def identifier_list(to_save, book):
|
||||
val_key = id_val_prefix + type_key[len(id_type_prefix):]
|
||||
if val_key not in to_save.keys():
|
||||
continue
|
||||
result.append( db.Identifiers(to_save[val_key], type_value, book.id) )
|
||||
result.append(db.Identifiers(to_save[val_key], type_value, book.id))
|
||||
return result
|
||||
|
||||
@editbook.route("/upload", methods=["GET", "POST"])
|
||||
@ -710,7 +727,7 @@ def upload():
|
||||
# check if file extension is correct
|
||||
if '.' in requested_file.filename:
|
||||
file_ext = requested_file.filename.rsplit('.', 1)[-1].lower()
|
||||
if file_ext not in constants.EXTENSIONS_UPLOAD:
|
||||
if file_ext not in constants.EXTENSIONS_UPLOAD and '' not in constants.EXTENSIONS_UPLOAD:
|
||||
flash(
|
||||
_("File extension '%(ext)s' is not allowed to be uploaded to this server",
|
||||
ext=file_ext), category="error")
|
||||
@ -833,8 +850,8 @@ def upload():
|
||||
|
||||
# move cover to final directory, including book id
|
||||
if has_cover:
|
||||
new_coverpath = os.path.join(config.config_calibre_dir, db_book.path, "cover.jpg")
|
||||
try:
|
||||
new_coverpath = os.path.join(config.config_calibre_dir, db_book.path, "cover.jpg")
|
||||
copyfile(meta.cover, new_coverpath)
|
||||
os.unlink(meta.cover)
|
||||
except OSError as e:
|
||||
|
cps/epub.py: 17 changed lines
@@ -22,6 +22,7 @@ import zipfile

from lxml import etree

from . import isoLanguages
from .helper import split_authors
from .constants import BookMeta

@@ -64,9 +65,9 @@ def get_epub_info(tmp_file_path, original_file_name, original_file_extension):

tmp = p.xpath('dc:%s/text()' % s, namespaces=ns)
if len(tmp) > 0:
if s == 'creator':
epub_metadata[s] = ' & '.join(p.xpath('dc:%s/text()' % s, namespaces=ns))
epub_metadata[s] = ' & '.join(split_authors(p.xpath('dc:%s/text()' % s, namespaces=ns)))
elif s == 'subject':
epub_metadata[s] = ', '.join(p.xpath('dc:%s/text()' % s, namespaces=ns))
epub_metadata[s] = ', '.join(p.xpath('dc:%s/text()' % s, namespaces=ns))
else:
epub_metadata[s] = p.xpath('dc:%s/text()' % s, namespaces=ns)[0]
else:

@@ -82,16 +83,8 @@ def get_epub_info(tmp_file_path, original_file_name, original_file_extension):

else:
epub_metadata['description'] = ""

if epub_metadata['language'] == u'Unknown':
epub_metadata['language'] = ""
else:
lang = epub_metadata['language'].split('-', 1)[0].lower()
if len(lang) == 2:
epub_metadata['language'] = isoLanguages.get(part1=lang).name
elif len(lang) == 3:
epub_metadata['language'] = isoLanguages.get(part3=lang).name
else:
epub_metadata['language'] = ""
lang = epub_metadata['language'].split('-', 1)[0].lower()
epub_metadata['language'] = isoLanguages.get_lang3(lang)

series = tree.xpath("/pkg:package/pkg:metadata/pkg:meta[@name='calibre:series']/@content", namespaces=ns)
if len(series) > 0:
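Both here and in cps/comic.py the old branching on len(lang) is replaced by a single isoLanguages.get_lang3() call. A rough, self-contained sketch of what such a helper consolidates; the lookup tables below are tiny stand-ins and the real implementation in cps/isoLanguages.py may differ:

```python
# Tiny stand-in tables; the real helper uses the project's ISO 639 language data.
_BY_PART1 = {"en": "English", "de": "German"}
_BY_PART3 = {"eng": "English", "deu": "German"}


def get_lang3_sketch(lang):
    """Map a 2- or 3-letter ISO 639 code to a language name; anything else -> ''."""
    if len(lang) == 2:
        return _BY_PART1.get(lang, "")
    if len(lang) == 3:
        return _BY_PART3.get(lang, "")
    return ""


print(get_lang3_sketch("en"))    # English
print(get_lang3_sketch("deu"))   # German
print(get_lang3_sketch(""))      # '' (unknown or empty language)
```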
@@ -34,18 +34,17 @@ from flask import Blueprint, flash, request, redirect, url_for, abort

from flask_babel import gettext as _
from flask_login import login_required

try:
from googleapiclient.errors import HttpError
except ImportError:
pass

from . import logger, gdriveutils, config, ub, calibre_db
from .web import admin_required

gdrive = Blueprint('gdrive', __name__)
log = logger.create()

try:
from googleapiclient.errors import HttpError
except ImportError as err:
log.debug("Cannot import googleapiclient, using GDrive will not work: %s", err)

current_milli_time = lambda: int(round(time() * 1000))

gdrive_watch_callback_token = 'target=calibreweb-watch_files'
@ -73,7 +72,7 @@ def google_drive_callback():
|
||||
credentials = gdriveutils.Gauth.Instance().auth.flow.step2_exchange(auth_code)
|
||||
with open(gdriveutils.CREDENTIALS, 'w') as f:
|
||||
f.write(credentials.to_json())
|
||||
except ValueError as error:
|
||||
except (ValueError, AttributeError) as error:
|
||||
log.error(error)
|
||||
return redirect(url_for('admin.configuration'))
|
||||
|
||||
@ -94,8 +93,7 @@ def watch_gdrive():
|
||||
try:
|
||||
result = gdriveutils.watchChange(gdriveutils.Gdrive.Instance().drive, notification_id,
|
||||
'web_hook', address, gdrive_watch_callback_token, current_milli_time() + 604800*1000)
|
||||
config.config_google_drive_watch_changes_response = json.dumps(result)
|
||||
# after save(), config_google_drive_watch_changes_response will be a json object, not string
|
||||
config.config_google_drive_watch_changes_response = result
|
||||
config.save()
|
||||
except HttpError as e:
|
||||
reason=json.loads(e.content)['error']['errors'][0]
|
||||
@ -118,7 +116,7 @@ def revoke_watch_gdrive():
|
||||
last_watch_response['resourceId'])
|
||||
except HttpError:
|
||||
pass
|
||||
config.config_google_drive_watch_changes_response = None
|
||||
config.config_google_drive_watch_changes_response = {}
|
||||
config.save()
|
||||
return redirect(url_for('admin.configuration'))
|
||||
|
||||
@ -155,7 +153,7 @@ def on_received_watch_confirmation():
|
||||
log.info('Setting up new DB')
|
||||
# prevent error on windows, as os.rename does on exisiting files
|
||||
move(os.path.join(tmpDir, "tmp_metadata.db"), dbpath)
|
||||
calibre_db.setup_db(config, ub.app_DB_path)
|
||||
calibre_db.reconnect_db(config, ub.app_DB_path)
|
||||
except Exception as e:
|
||||
log.exception(e)
|
||||
updateMetaData()
|
||||
|
@ -27,14 +27,18 @@ from sqlalchemy import Column, UniqueConstraint
from sqlalchemy import String, Integer
from sqlalchemy.orm import sessionmaker, scoped_session
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.exc import OperationalError, InvalidRequestError
try:
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from pydrive.auth import RefreshError
from apiclient import errors
from httplib2 import ServerNotFoundError
gdrive_support = True
except ImportError:
importError = None
except ImportError as err:
importError = err
gdrive_support = False
from . import logger, cli, config
@ -50,6 +54,8 @@ if gdrive_support:
logger.get('googleapiclient.discovery_cache').setLevel(logger.logging.ERROR)
if not logger.is_debug_enabled():
logger.get('googleapiclient.discovery').setLevel(logger.logging.ERROR)
else:
log.debug("Cannot import pydrive,httplib2, using gdrive will not work: %s", importError)
class Singleton:
@ -97,7 +103,11 @@ class Singleton:
@Singleton
class Gauth:
def __init__(self):
self.auth = GoogleAuth(settings_file=SETTINGS_YAML)
try:
self.auth = GoogleAuth(settings_file=SETTINGS_YAML)
except NameError as error:
log.error(error)
self.auth = None
@Singleton
@ -192,14 +202,18 @@ def getDrive(drive=None, gauth=None):
return drive
def listRootFolders():
drive = getDrive(Gdrive.Instance().drive)
folder = "'root' in parents and mimeType = 'application/vnd.google-apps.folder' and trashed = false"
fileList = drive.ListFile({'q': folder}).GetList()
try:
drive = getDrive(Gdrive.Instance().drive)
folder = "'root' in parents and mimeType = 'application/vnd.google-apps.folder' and trashed = false"
fileList = drive.ListFile({'q': folder}).GetList()
except ServerNotFoundError as e:
log.info("GDrive Error %s" % e)
fileList = []
return fileList
def getEbooksFolder(drive):
return getFolderInFolder('root',config.config_google_drive_folder,drive)
return getFolderInFolder('root', config.config_google_drive_folder, drive)
def getFolderInFolder(parentId, folderName, drive):
@ -229,7 +243,7 @@ def getEbooksFolderId(drive=None):
gDriveId.path = '/'
session.merge(gDriveId)
session.commit()
return
return gDriveId.gdrive_id
def getFile(pathId, fileName, drive):
@ -474,8 +488,13 @@ def getChangeById (drive, change_id):
# Deletes the local hashes database to force search for new folder names
def deleteDatabaseOnChange():
session.query(GdriveId).delete()
session.commit()
try:
session.query(GdriveId).delete()
session.commit()
except (OperationalError, InvalidRequestError):
session.rollback()
log.info(u"GDrive DB is not Writeable")
def updateGdriveCalibreFromLocal():
copyToDrive(Gdrive.Instance().drive, config.config_calibre_dir, False, True)
@ -583,8 +602,12 @@ def get_error_text(client_secrets=None):
if not os.path.isfile(CLIENT_SECRETS):
return 'client_secrets.json is missing or not readable'
with open(CLIENT_SECRETS, 'r') as settings:
filedata = json.load(settings)
try:
with open(CLIENT_SECRETS, 'r') as settings:
filedata = json.load(settings)
except PermissionError:
return 'client_secrets.json is missing or not readable'
if 'web' not in filedata:
return 'client_secrets.json is not configured for web application'
if 'redirect_uris' not in filedata['web']:
105
cps/helper.py
@ -21,7 +21,6 @@ from __future__ import division, print_function, unicode_literals
|
||||
import sys
|
||||
import os
|
||||
import io
|
||||
import json
|
||||
import mimetypes
|
||||
import re
|
||||
import shutil
|
||||
@ -36,7 +35,7 @@ from babel.units import format_unit
|
||||
from flask import send_from_directory, make_response, redirect, abort
|
||||
from flask_babel import gettext as _
|
||||
from flask_login import current_user
|
||||
from sqlalchemy.sql.expression import true, false, and_, or_, text, func
|
||||
from sqlalchemy.sql.expression import true, false, and_, text, func
|
||||
from werkzeug.datastructures import Headers
|
||||
from werkzeug.security import generate_password_hash
|
||||
from . import calibre_db
|
||||
@ -59,10 +58,9 @@ try:
|
||||
except ImportError:
|
||||
use_PIL = False
|
||||
|
||||
from . import logger, config, get_locale, db, ub, isoLanguages, worker
|
||||
from . import logger, config, get_locale, db, ub, worker
|
||||
from . import gdriveutils as gd
|
||||
from .constants import STATIC_DIR as _STATIC_DIR
|
||||
from .pagination import Pagination
|
||||
from .subproc_wrapper import process_wait
|
||||
from .worker import STAT_WAITING, STAT_FAIL, STAT_STARTED, STAT_FINISH_SUCCESS
|
||||
from .worker import TASK_EMAIL, TASK_CONVERT, TASK_UPLOAD, TASK_CONVERT_ANY
|
||||
@ -100,10 +98,10 @@ def convert_book_format(book_id, calibrepath, old_book_format, new_book_format,
|
||||
# text = _(u"%(format)s: %(book)s", format=new_book_format, book=book.title)
|
||||
else:
|
||||
settings = dict()
|
||||
text = (u"%s -> %s: %s" % (old_book_format, new_book_format, book.title))
|
||||
txt = (u"%s -> %s: %s" % (old_book_format, new_book_format, book.title))
|
||||
settings['old_book_format'] = old_book_format
|
||||
settings['new_book_format'] = new_book_format
|
||||
worker.add_convert(file_path, book.id, user_id, text, settings, kindle_mail)
|
||||
worker.add_convert(file_path, book.id, user_id, txt, settings, kindle_mail)
|
||||
return None
|
||||
else:
|
||||
error_message = _(u"%(format)s not found: %(fn)s",
|
||||
@ -239,22 +237,22 @@ def get_valid_filename(value, replace_whitespace=True):
value = value[:-1]+u'_'
value = value.replace("/", "_").replace(":", "_").strip('\0')
if use_unidecode:
value = (unidecode.unidecode(value)).strip()
value = (unidecode.unidecode(value))
else:
value = value.replace(u'§', u'SS')
value = value.replace(u'ß', u'ss')
value = unicodedata.normalize('NFKD', value)
re_slugify = re.compile(r'[\W\s-]', re.UNICODE)
if isinstance(value, str): # Python3 str, Python2 unicode
value = re_slugify.sub('', value).strip()
value = re_slugify.sub('', value)
else:
value = unicode(re_slugify.sub('', value).strip())
value = unicode(re_slugify.sub('', value))
if replace_whitespace:
# *+:\"/<>? are replaced by _
value = re.sub(r'[\*\+:\\\"/<>\?]+', u'_', value, flags=re.U)
value = re.sub(r'[*+:\\\"/<>?]+', u'_', value, flags=re.U)
# pipe has to be replaced with comma
value = re.sub(r'[\|]+', u',', value, flags=re.U)
value = value[:128]
value = re.sub(r'[|]+', u',', value, flags=re.U)
value = value[:128].strip()
if not value:
raise ValueError("Filename cannot be empty")
if sys.version_info.major == 3:
@ -263,6 +261,22 @@ def get_valid_filename(value, replace_whitespace=True):
return value.decode('utf-8')


def split_authors(values):
    authors_list = []
    for value in values:
        authors = re.split('[&;]', value)
        for author in authors:
            commas = author.count(',')
            if commas == 1:
                author_split = author.split(',')
                authors_list.append(author_split[1].strip() + ' ' + author_split[0].strip())
            elif commas > 1:
                authors_list.extend([x.strip() for x in author.split(',')])
            else:
                authors_list.append(author.strip())
    return authors_list
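As an illustration of what the new split_authors() helper produces, a small standalone sketch follows; the function body is repeated from the hunk above so the snippet runs on its own, and the author strings are invented for the example:

import re

def split_authors(values):
    # repeated from cps/helper.py as added in this commit
    authors_list = []
    for value in values:
        authors = re.split('[&;]', value)
        for author in authors:
            commas = author.count(',')
            if commas == 1:
                author_split = author.split(',')
                authors_list.append(author_split[1].strip() + ' ' + author_split[0].strip())
            elif commas > 1:
                authors_list.extend([x.strip() for x in author.split(',')])
            else:
                authors_list.append(author.strip())
    return authors_list

print(split_authors([u'Tolkien, J. R. R. & Lewis, C. S.']))
# -> ['J. R. R. Tolkien', 'C. S. Lewis']  (one comma per name: flipped to "First Last")
print(split_authors([u'John Doe;Jane Roe']))
# -> ['John Doe', 'Jane Roe']  (split on '&' or ';', names without a comma are kept as-is)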
def get_sorted_author(value):
try:
if ',' not in value:
@ -270,7 +284,10 @@ def get_sorted_author(value):
combined = "(" + ")|(".join(regexes) + ")"
value = value.split(" ")
if re.match(combined, value[-1].upper()):
value2 = value[-2] + ", " + " ".join(value[:-2]) + " " + value[-1]
if len(value) > 1:
value2 = value[-2] + ", " + " ".join(value[:-2]) + " " + value[-1]
else:
value2 = value[0]
elif len(value) == 1:
value2 = value[0]
else:
@ -279,7 +296,10 @@
value2 = value
except Exception as ex:
log.error("Sorting author %s failed: %s", value, ex)
value2 = value
if isinstance(list, value2):
value2 = value[0]
else:
value2 = value
return value2
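A minimal sketch (not from the commit) of why the new len(value) > 1 guard matters; the suffix list below is a simplified stand-in for the regex list get_sorted_author() actually uses:

import re

# simplified stand-in for the generational-suffix check in get_sorted_author()
regexes = ["JR", "SR", "II", "III", "IV"]  # illustrative subset of the real list
combined = "(" + ")|(".join(regexes) + ")"

for name in (u"William Gates III", u"III"):
    value = name.split(" ")
    if re.match(combined, value[-1].upper()):
        if len(value) > 1:
            value2 = value[-2] + ", " + " ".join(value[:-2]) + " " + value[-1]
        else:
            value2 = value[0]  # a lone suffix-like token no longer raises an IndexError
    else:
        value2 = name
    print(value2)
# -> "Gates, William III" and "III"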
@ -295,15 +315,16 @@ def delete_book_file(book, calibrepath, book_format=None):
|
||||
return True, None
|
||||
else:
|
||||
if os.path.isdir(path):
|
||||
if len(next(os.walk(path))[1]):
|
||||
log.error("Deleting book %s failed, path has subfolders: %s", book.id, book.path)
|
||||
return False , _("Deleting book %(id)s failed, path has subfolders: %(path)s",
|
||||
id=book.id,
|
||||
path=book.path)
|
||||
try:
|
||||
for root, __, files in os.walk(path):
|
||||
for root, folders, files in os.walk(path):
|
||||
for f in files:
|
||||
os.unlink(os.path.join(root, f))
|
||||
if len(folders):
|
||||
log.warning("Deleting book {} failed, path {} has subfolders: {}".format(book.id,
|
||||
book.path, folders))
|
||||
return True, _("Deleting bookfolder for book %(id)s failed, path has subfolders: %(path)s",
|
||||
id=book.id,
|
||||
path=book.path)
|
||||
shutil.rmtree(path)
|
||||
except (IOError, OSError) as e:
|
||||
log.error("Deleting book %s failed: %s", book.id, e)
|
||||
@ -318,8 +339,8 @@ def delete_book_file(book, calibrepath, book_format=None):
|
||||
else:
|
||||
log.error("Deleting book %s failed, book path not valid: %s", book.id, book.path)
|
||||
return True, _("Deleting book %(id)s, book path not valid: %(path)s",
|
||||
id=book.id,
|
||||
path=book.path)
|
||||
id=book.id,
|
||||
path=book.path)
|
||||
|
||||
|
||||
def update_dir_structure_file(book_id, calibrepath, first_author):
|
||||
@ -339,13 +360,13 @@ def update_dir_structure_file(book_id, calibrepath, first_author):
|
||||
new_title_path = os.path.join(os.path.dirname(path), new_titledir)
|
||||
try:
|
||||
if not os.path.exists(new_title_path):
|
||||
os.renames(path, new_title_path)
|
||||
os.renames(os.path.normcase(path), os.path.normcase(new_title_path))
|
||||
else:
|
||||
log.info("Copying title: %s into existing: %s", path, new_title_path)
|
||||
for dir_name, __, file_list in os.walk(path):
|
||||
for file in file_list:
|
||||
os.renames(os.path.join(dir_name, file),
|
||||
os.path.join(new_title_path + dir_name[len(path):], file))
|
||||
os.renames(os.path.normcase(os.path.join(dir_name, file)),
|
||||
os.path.normcase(os.path.join(new_title_path + dir_name[len(path):], file)))
|
||||
path = new_title_path
|
||||
localbook.path = localbook.path.split('/')[0] + '/' + new_titledir
|
||||
except OSError as ex:
|
||||
@ -356,7 +377,7 @@ def update_dir_structure_file(book_id, calibrepath, first_author):
|
||||
if authordir != new_authordir:
|
||||
new_author_path = os.path.join(calibrepath, new_authordir, os.path.basename(path))
|
||||
try:
|
||||
os.renames(path, new_author_path)
|
||||
os.renames(os.path.normcase(path), os.path.normcase(new_author_path))
|
||||
localbook.path = new_authordir + '/' + localbook.path.split('/')[1]
|
||||
except OSError as ex:
|
||||
log.error("Rename author from: %s to %s: %s", path, new_author_path, ex)
|
||||
@ -365,12 +386,14 @@ def update_dir_structure_file(book_id, calibrepath, first_author):
|
||||
src=path, dest=new_author_path, error=str(ex))
|
||||
# Rename all files from old names to new names
|
||||
if authordir != new_authordir or titledir != new_titledir:
|
||||
new_name = ""
|
||||
try:
|
||||
new_name = get_valid_filename(localbook.title) + ' - ' + get_valid_filename(new_authordir)
|
||||
path_name = os.path.join(calibrepath, new_authordir, os.path.basename(path))
|
||||
for file_format in localbook.data:
|
||||
os.renames(os.path.join(path_name, file_format.name + '.' + file_format.format.lower()),
|
||||
os.path.join(path_name, new_name + '.' + file_format.format.lower()))
|
||||
os.renames(os.path.normcase(
|
||||
os.path.join(path_name, file_format.name + '.' + file_format.format.lower())),
|
||||
os.path.normcase(os.path.join(path_name, new_name + '.' + file_format.format.lower())))
|
||||
file_format.name = new_name
|
||||
except OSError as ex:
|
||||
log.error("Rename file in path %s to %s: %s", path, new_name, ex)
|
||||
@ -466,17 +489,20 @@ def reset_password(user_id):
def generate_random_password():
s = "abcdefghijklmnopqrstuvwxyz01234567890ABCDEFGHIJKLMNOPQRSTUVWXYZ!@#$%&*()?"
passlen = 8
return "".join(s[c % len(s)] for c in os.urandom(passlen))
if sys.version_info < (3, 0):
return "".join(s[ord(c) % len(s)] for c in os.urandom(passlen))
else:
return "".join(s[c % len(s)] for c in os.urandom(passlen))
def uniq(input):
output = []
for x in input:
if x not in output:
output.append(x)
return output
def uniq(inpt):
output = []
for x in inpt:
if x not in output:
output.append(x)
return output
################################## External interface
# ################################# External interface #################################
def update_dir_stucture(book_id, calibrepath, first_author=None):
@ -553,7 +579,6 @@ def save_cover_from_url(url, book_path):
|
||||
return False, _("Cover Format Error")
|
||||
|
||||
|
||||
|
||||
def save_cover_from_filestorage(filepath, saved_filename, img):
|
||||
if hasattr(img, '_content'):
|
||||
f = open(os.path.join(filepath, saved_filename), "wb")
|
||||
@ -612,7 +637,6 @@ def save_cover(img, book_path):
|
||||
return save_cover_from_filestorage(os.path.join(config.config_calibre_dir, book_path), "cover.jpg", img)
|
||||
|
||||
|
||||
|
||||
def do_download_file(book, book_format, client, data, headers):
|
||||
if config.config_use_google_drive:
|
||||
startTime = time.time()
|
||||
@ -713,7 +737,7 @@ def render_task_status(tasklist):
|
||||
task['runtime'] = format_runtime(task['formRuntime'])
|
||||
|
||||
# localize the task status
|
||||
if isinstance( task['stat'], int):
|
||||
if isinstance(task['stat'], int):
|
||||
if task['stat'] == STAT_WAITING:
|
||||
task['status'] = _(u'Waiting')
|
||||
elif task['stat'] == STAT_FAIL:
|
||||
@ -726,7 +750,7 @@ def render_task_status(tasklist):
|
||||
task['status'] = _(u'Unknown Status')
|
||||
|
||||
# localize the task type
|
||||
if isinstance( task['taskType'], int):
|
||||
if isinstance(task['taskType'], int):
|
||||
if task['taskType'] == TASK_EMAIL:
|
||||
task['taskMessage'] = _(u'E-mail: ') + task['taskMess']
|
||||
elif task['taskType'] == TASK_CONVERT:
|
||||
@ -782,6 +806,7 @@ def get_cc_columns(filter_config_custom_read=False):
|
||||
|
||||
return cc
|
||||
|
||||
|
||||
def get_download_link(book_id, book_format, client):
|
||||
book_format = book_format.split(".")[0]
|
||||
book = calibre_db.get_filtered_book(book_id)
|
||||
|
@ -66,3 +66,27 @@ def get_language_codes(locale, language_names, remainder=None):
if remainder is not None:
remainder.extend(language_names)
return languages


def get_valid_language_codes(locale, language_names, remainder=None):
    languages = list()
    if "" in language_names:
        language_names.remove("")
    for k, v in get_language_names(locale).items():
        if k in language_names:
            languages.append(k)
            language_names.remove(k)
    if remainder is not None and len(language_names):
        remainder.extend(language_names)
    return languages


def get_lang3(lang):
    try:
        if len(lang) == 2:
            ret_value = get(part1=lang).part3
        elif len(lang) == 3:
            ret_value = lang
        else:
            ret_value = ""
    except KeyError:
        ret_value = lang
    return ret_value
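A self-contained sketch of the new get_lang3() control flow, which epub.py now uses to normalise language codes; the lookup table below is an illustrative stand-in for the ISO 639 data the real get() helper consults:

class _Lang:
    def __init__(self, part3):
        self.part3 = part3

_FAKE_ISO639 = {'en': _Lang('eng'), 'de': _Lang('deu')}  # illustrative stand-in data

def get(part1):
    return _FAKE_ISO639[part1]  # raises KeyError for unknown codes, like the real helper

def get_lang3(lang):
    try:
        if len(lang) == 2:
            ret_value = get(part1=lang).part3
        elif len(lang) == 3:
            ret_value = lang
        else:
            ret_value = ""
    except KeyError:
        ret_value = lang
    return ret_value

print(get_lang3('en'))   # -> 'eng' (two-letter code looked up)
print(get_lang3('deu'))  # -> 'deu' (three-letter codes pass through)
print(get_lang3('xx'))   # -> 'xx'  (unknown two-letter code falls back to the input)
print(get_lang3(''))     # -> ''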
@ -111,3 +111,10 @@ def timestamptodate(date, fmt=None):
@jinjia.app_template_filter('yesno')
def yesno(value, yes, no):
    return yes if value else no


@jinjia.app_template_filter('formatfloat')
def formatfloat(value, decimals=1):
    formatedstring = '%d' % value
    if (value % 1) != 0:
        formatedstring = ('%s.%d' % (formatedstring, (value % 1) * 10**decimals)).rstrip('0')
    return formatedstring
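For reference, a standalone sketch of what the new formatfloat filter produces; the filter body is repeated from the hunk above so it can run outside Flask:

def formatfloat(value, decimals=1):
    formatedstring = '%d' % value
    if (value % 1) != 0:
        formatedstring = ('%s.%d' % (formatedstring, (value % 1) * 10**decimals)).rstrip('0')
    return formatedstring

print(formatfloat(4.0))      # '4'    - whole numbers lose the trailing '.0'
print(formatfloat(2.5))      # '2.5'
print(formatfloat(3.25, 2))  # '3.25' - the two-decimal form the templates use for series indexes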
48
cps/kobo.py
@ -19,8 +19,6 @@
|
||||
|
||||
import base64
|
||||
import datetime
|
||||
import itertools
|
||||
import json
|
||||
import sys
|
||||
import os
|
||||
import uuid
|
||||
@ -131,7 +129,7 @@ def HandleSyncRequest():
|
||||
sync_token = SyncToken.SyncToken.from_headers(request.headers)
|
||||
log.info("Kobo library sync request received.")
|
||||
if not current_app.wsgi_app.is_proxied:
|
||||
log.debug('Kobo: Received unproxied request, changed request port to server port')
|
||||
log.debug('Kobo: Received unproxied request, changed request port to external server port')
|
||||
|
||||
# TODO: Limit the number of books return per sync call, and rely on the sync-continuatation header
|
||||
# instead so that the device triggers another sync.
|
||||
@ -254,7 +252,7 @@ def generate_sync_response(sync_token, sync_results):
|
||||
@download_required
|
||||
def HandleMetadataRequest(book_uuid):
|
||||
if not current_app.wsgi_app.is_proxied:
|
||||
log.debug('Kobo: Received unproxied request, changed request port to server port')
|
||||
log.debug('Kobo: Received unproxied request, changed request port to external server port')
|
||||
log.info("Kobo library metadata request received for book %s" % book_uuid)
|
||||
book = calibre_db.get_book_by_uuid(book_uuid)
|
||||
if not book or not book.data:
|
||||
@ -267,14 +265,15 @@ def HandleMetadataRequest(book_uuid):
def get_download_url_for_book(book, book_format):
if not current_app.wsgi_app.is_proxied:
if ':' in request.host and not request.host.endswith(']') :
if ':' in request.host and not request.host.endswith(']'):
host = "".join(request.host.split(':')[:-1])
else:
host = request.host
return "{url_scheme}://{url_base}:{url_port}/download/{book_id}/{book_format}".format(
url_scheme=request.scheme,
url_base=host,
url_port=config.config_port,
url_port=config.config_external_port,
book_id=book.id,
book_format=book_format.lower()
)
@ -317,8 +316,15 @@ def get_description(book):
# TODO handle multiple authors
def get_author(book):
if not book.authors:
return None
return book.authors[0].name
return {"Contributors": None}
if len(book.authors) > 1:
author_list = []
autor_roles = []
for author in book.authors:
autor_roles.append({"Name":author.name, "Role":"Author"})
author_list.append(author.name)
return {"ContributorRoles": autor_roles, "Contributors":author_list}
return {"ContributorRoles": [{"Name":book.authors[0].name, "Role":"Author"}], "Contributors": book.authors[0].name}
def get_publisher(book):
@ -357,7 +363,7 @@ def get_metadata(book):
|
||||
book_uuid = book.uuid
|
||||
metadata = {
|
||||
"Categories": ["00000000-0000-0000-0000-000000000001",],
|
||||
"Contributors": get_author(book),
|
||||
# "Contributors": get_author(book),
|
||||
"CoverImageId": book_uuid,
|
||||
"CrossRevisionId": book_uuid,
|
||||
"CurrentDisplayPrice": {"CurrencyCode": "USD", "TotalAmount": 0},
|
||||
@ -381,6 +387,7 @@ def get_metadata(book):
|
||||
"Title": book.title,
|
||||
"WorkId": book_uuid,
|
||||
}
|
||||
metadata.update(get_author(book))
|
||||
|
||||
if get_series(book):
|
||||
if sys.version_info < (3, 0):
|
||||
@ -399,7 +406,7 @@ def get_metadata(book):
|
||||
|
||||
|
||||
@kobo.route("/v1/library/tags", methods=["POST", "DELETE"])
|
||||
@login_required
|
||||
@requires_kobo_auth
|
||||
# Creates a Shelf with the given items, and returns the shelf's uuid.
|
||||
def HandleTagCreate():
|
||||
# catch delete requests, otherwise the are handeld in the book delete handler
|
||||
@ -434,6 +441,7 @@ def HandleTagCreate():
|
||||
|
||||
|
||||
@kobo.route("/v1/library/tags/<tag_id>", methods=["DELETE", "PUT"])
|
||||
@requires_kobo_auth
|
||||
def HandleTagUpdate(tag_id):
|
||||
shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.uuid == tag_id,
|
||||
ub.Shelf.user_id == current_user.id).one_or_none()
|
||||
@ -488,7 +496,7 @@ def add_items_to_shelf(items, shelf):
|
||||
|
||||
|
||||
@kobo.route("/v1/library/tags/<tag_id>/items", methods=["POST"])
|
||||
@login_required
|
||||
@requires_kobo_auth
|
||||
def HandleTagAddItem(tag_id):
|
||||
items = None
|
||||
try:
|
||||
@ -518,7 +526,7 @@ def HandleTagAddItem(tag_id):
|
||||
|
||||
|
||||
@kobo.route("/v1/library/tags/<tag_id>/items/delete", methods=["POST"])
|
||||
@login_required
|
||||
@requires_kobo_auth
|
||||
def HandleTagRemoveItem(tag_id):
|
||||
items = None
|
||||
try:
|
||||
@ -627,7 +635,7 @@ def create_kobo_tag(shelf):
|
||||
|
||||
|
||||
@kobo.route("/v1/library/<book_uuid>/state", methods=["GET", "PUT"])
|
||||
@login_required
|
||||
@requires_kobo_auth
|
||||
def HandleStateRequest(book_uuid):
|
||||
book = calibre_db.get_book_by_uuid(book_uuid)
|
||||
if not book or not book.data:
|
||||
@ -801,7 +809,7 @@ def TopLevelEndpoint():
|
||||
|
||||
|
||||
@kobo.route("/v1/library/<book_uuid>", methods=["DELETE"])
|
||||
@login_required
|
||||
@requires_kobo_auth
|
||||
def HandleBookDeletionRequest(book_uuid):
|
||||
log.info("Kobo book deletion request received for book %s" % book_uuid)
|
||||
book = calibre_db.get_book_by_uuid(book_uuid)
|
||||
@ -917,7 +925,7 @@ def HandleInitRequest():
|
||||
kobo_resources = NATIVE_KOBO_RESOURCES()
|
||||
|
||||
if not current_app.wsgi_app.is_proxied:
|
||||
log.debug('Kobo: Received unproxied request, changed request port to server port')
|
||||
log.debug('Kobo: Received unproxied request, changed request port to external server port')
|
||||
if ':' in request.host and not request.host.endswith(']'):
|
||||
host = "".join(request.host.split(':')[:-1])
|
||||
else:
|
||||
@ -925,8 +933,9 @@ def HandleInitRequest():
|
||||
calibre_web_url = "{url_scheme}://{url_base}:{url_port}".format(
|
||||
url_scheme=request.scheme,
|
||||
url_base=host,
|
||||
url_port=config.config_port
|
||||
url_port=config.config_external_port
|
||||
)
|
||||
log.debug('Kobo: Received unproxied request, changed request url to %s', calibre_web_url)
|
||||
kobo_resources["image_host"] = calibre_web_url
|
||||
kobo_resources["image_url_quality_template"] = unquote(calibre_web_url +
|
||||
url_for("kobo.HandleCoverImageRequest",
|
||||
@ -935,16 +944,14 @@ def HandleInitRequest():
|
||||
width="{width}",
|
||||
height="{height}",
|
||||
Quality='{Quality}',
|
||||
isGreyscale='isGreyscale'
|
||||
))
|
||||
isGreyscale='isGreyscale'))
|
||||
kobo_resources["image_url_template"] = unquote(calibre_web_url +
|
||||
url_for("kobo.HandleCoverImageRequest",
|
||||
auth_token=kobo_auth.get_auth_token(),
|
||||
book_uuid="{ImageId}",
|
||||
width="{width}",
|
||||
height="{height}",
|
||||
isGreyscale='false'
|
||||
))
|
||||
isGreyscale='false'))
|
||||
else:
|
||||
kobo_resources["image_host"] = url_for("web.index", _external=True).strip("/")
|
||||
kobo_resources["image_url_quality_template"] = unquote(url_for("kobo.HandleCoverImageRequest",
|
||||
@ -963,7 +970,6 @@ def HandleInitRequest():
|
||||
isGreyscale='false',
|
||||
_external=True))
|
||||
|
||||
|
||||
response = make_response(jsonify({"Resources": kobo_resources}))
|
||||
response.headers["x-kobo-apitoken"] = "e30="
|
||||
|
||||
|
@ -126,11 +126,11 @@ def setup(log_file, log_level=None):
file_handler.baseFilename = log_file
else:
try:
file_handler = RotatingFileHandler(log_file, maxBytes=50000, backupCount=2)
file_handler = RotatingFileHandler(log_file, maxBytes=50000, backupCount=2, encoding='utf-8')
except IOError:
if log_file == DEFAULT_LOG_FILE:
raise
file_handler = RotatingFileHandler(DEFAULT_LOG_FILE, maxBytes=50000, backupCount=2)
file_handler = RotatingFileHandler(DEFAULT_LOG_FILE, maxBytes=50000, backupCount=2, encoding='utf-8')
log_file = ""
file_handler.setFormatter(FORMATTER)
@ -152,11 +152,11 @@ def create_access_log(log_file, log_name, formatter):
access_log.propagate = False
access_log.setLevel(logging.INFO)
try:
file_handler = RotatingFileHandler(log_file, maxBytes=50000, backupCount=2)
file_handler = RotatingFileHandler(log_file, maxBytes=50000, backupCount=2, encoding='utf-8')
except IOError:
if log_file == DEFAULT_ACCESS_LOG:
raise
file_handler = RotatingFileHandler(DEFAULT_ACCESS_LOG, maxBytes=50000, backupCount=2)
file_handler = RotatingFileHandler(DEFAULT_ACCESS_LOG, maxBytes=50000, backupCount=2, encoding='utf-8')
log_file = ""
file_handler.setFormatter(formatter)
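A small standalone sketch (not from the commit) of the effect of passing encoding='utf-8' to RotatingFileHandler, which keeps non-ASCII titles from being garbled in the log files; the logger name and file name are illustrative:

import logging
from logging.handlers import RotatingFileHandler

log = logging.getLogger("calibre-web-demo")  # hypothetical logger name
handler = RotatingFileHandler("demo.log", maxBytes=50000, backupCount=2, encoding='utf-8')
handler.setFormatter(logging.Formatter("%(asctime)s - %(levelname)s: %(message)s"))
log.addHandler(handler)
log.warning(u"Uploading book: Les Misérables")  # written as UTF-8 instead of failing or garbling on some platforms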
@ -122,10 +122,10 @@ if ub.oauth_support:
|
||||
ele2 = dict(provider_name='google',
|
||||
id=oauth_ids[1].id,
|
||||
active=oauth_ids[1].active,
|
||||
scope=["https://www.googleapis.com/auth/plus.me", "https://www.googleapis.com/auth/userinfo.email"],
|
||||
scope=["https://www.googleapis.com/auth/userinfo.email"],
|
||||
oauth_client_id=oauth_ids[1].oauth_client_id,
|
||||
oauth_client_secret=oauth_ids[1].oauth_client_secret,
|
||||
obtain_link='https://github.com/settings/developers')
|
||||
obtain_link='https://console.developers.google.com/apis/credentials')
|
||||
oauthblueprints.append(ele1)
|
||||
oauthblueprints.append(ele2)
|
||||
|
||||
@ -287,7 +287,7 @@ if ub.oauth_support:
|
||||
flash(_(u"Unlink to %(oauth)s Failed", oauth=oauth_check[provider]), category="error")
|
||||
except NoResultFound:
|
||||
log.warning("oauth %s for user %d not found", provider, current_user.id)
|
||||
flash(_(u"Not Linked to %(oauth)s.", oauth=oauth_check[provider]), category="error")
|
||||
flash(_(u"Not Linked to %(oauth)s", oauth=provider), category="error")
|
||||
return redirect(url_for('web.profile'))
|
||||
|
||||
|
||||
@ -355,4 +355,4 @@ if ub.oauth_support:
|
||||
@oauth.route('/unlink/google', methods=["GET"])
|
||||
@login_required
|
||||
def google_login_unlink():
|
||||
return unlink_oauth(oauthblueprints[1]['blueprint'].name)
|
||||
return unlink_oauth(oauthblueprints[1]['id'])
|
||||
|
@ -77,6 +77,7 @@ class ReverseProxied(object):
servr = environ.get('HTTP_X_FORWARDED_HOST', '')
if servr:
environ['HTTP_HOST'] = servr
self.proxied = True
return self.app(environ, start_response)
@property
|
||||
|
@ -27,6 +27,8 @@ try:
from gevent.pywsgi import WSGIServer
from gevent.pool import Pool
from gevent import __version__ as _version
from greenlet import GreenletExit
import ssl
VERSION = 'Gevent ' + _version
_GEVENT = True
except ImportError:
@ -143,6 +145,16 @@ class WebServer(object):
output = _readable_listen_address(self.listen_address, self.listen_port)
log.info('Starting Gevent server on %s', output)
self.wsgiserver = WSGIServer(sock, self.app, log=self.access_logger, spawn=Pool(), **ssl_args)
if ssl_args:
wrap_socket = self.wsgiserver.wrap_socket
def my_wrap_socket(*args, **kwargs):
try:
return wrap_socket(*args, **kwargs)
except (ssl.SSLError) as ex:
log.warning('Gevent SSL Error: %s', ex)
raise GreenletExit
self.wsgiserver.wrap_socket = my_wrap_socket
self.wsgiserver.serve_forever()
finally:
if self.unix_socket_file:
@ -194,7 +206,7 @@ class WebServer(object):
os.execv(sys.executable, arguments)
return True
def _killServer(self, ignored_signum, ignored_frame):
def _killServer(self, __, ___):
self.stop()
def stop(self, restart=False):
@ -27,7 +27,10 @@ except ImportError:
from urllib.parse import unquote
from flask import json
from .. import logger as log
from .. import logger
log = logger.create()
def b64encode_json(json_data):
@ -45,7 +48,8 @@ def to_epoch_timestamp(datetime_object):
def get_datetime_from_json(json_object, field_name):
try:
return datetime.utcfromtimestamp(json_object[field_name])
except KeyError:
except (KeyError, OSError, OverflowError):
# OSError is thrown on Windows if timestamp is <1970 or >2038
return datetime.min
|
||||
|
||||
|
||||
|
@ -64,9 +64,11 @@ def init_app(app, config):
app.config['LDAP_OPENLDAP'] = bool(config.config_ldap_openldap)
app.config['LDAP_GROUP_OBJECT_FILTER'] = config.config_ldap_group_object_filter
app.config['LDAP_GROUP_MEMBERS_FIELD'] = config.config_ldap_group_members_field
# app.config['LDAP_CUSTOM_OPTIONS'] = {'OPT_NETWORK_TIMEOUT': 10}
_ldap.init_app(app)
try:
_ldap.init_app(app)
except RuntimeError as e:
log.error(e)
def get_object_details(user=None, group=None, query_filter=None, dn_only=False):
73
cps/shelf.py
@ -27,9 +27,10 @@ from flask import Blueprint, request, flash, redirect, url_for
|
||||
from flask_babel import gettext as _
|
||||
from flask_login import login_required, current_user
|
||||
from sqlalchemy.sql.expression import func
|
||||
from sqlalchemy.exc import OperationalError, InvalidRequestError
|
||||
|
||||
from . import logger, ub, searched_ids, db, calibre_db
|
||||
from .web import render_title_template
|
||||
from . import logger, ub, searched_ids, calibre_db
|
||||
from .web import login_required_if_no_ano, render_title_template
|
||||
|
||||
|
||||
shelf = Blueprint('shelf', __name__)
|
||||
@ -91,8 +92,16 @@ def add_to_shelf(shelf_id, book_id):
|
||||
|
||||
shelf.books.append(ub.BookShelf(shelf=shelf.id, book_id=book_id, order=maxOrder + 1))
|
||||
shelf.last_modified = datetime.utcnow()
|
||||
ub.session.merge(shelf)
|
||||
ub.session.commit()
|
||||
try:
|
||||
ub.session.merge(shelf)
|
||||
ub.session.commit()
|
||||
except (OperationalError, InvalidRequestError):
|
||||
ub.session.rollback()
|
||||
flash(_(u"Settings DB is not Writeable"), category="error")
|
||||
if "HTTP_REFERER" in request.environ:
|
||||
return redirect(request.environ["HTTP_REFERER"])
|
||||
else:
|
||||
return redirect(url_for('web.index'))
|
||||
if not xhr:
|
||||
flash(_(u"Book has been added to shelf: %(sname)s", sname=shelf.name), category="success")
|
||||
if "HTTP_REFERER" in request.environ:
|
||||
@ -143,9 +152,13 @@ def search_to_shelf(shelf_id):
|
||||
maxOrder = maxOrder + 1
|
||||
shelf.books.append(ub.BookShelf(shelf=shelf.id, book_id=book, order=maxOrder))
|
||||
shelf.last_modified = datetime.utcnow()
|
||||
ub.session.merge(shelf)
|
||||
ub.session.commit()
|
||||
flash(_(u"Books have been added to shelf: %(sname)s", sname=shelf.name), category="success")
|
||||
try:
|
||||
ub.session.merge(shelf)
|
||||
ub.session.commit()
|
||||
flash(_(u"Books have been added to shelf: %(sname)s", sname=shelf.name), category="success")
|
||||
except (OperationalError, InvalidRequestError):
|
||||
ub.session.rollback()
|
||||
flash(_(u"Settings DB is not Writeable"), category="error")
|
||||
else:
|
||||
flash(_(u"Could not add books to shelf: %(sname)s", sname=shelf.name), category="error")
|
||||
return redirect(url_for('web.index'))
|
||||
@ -180,10 +193,17 @@ def remove_from_shelf(shelf_id, book_id):
|
||||
return redirect(url_for('web.index'))
|
||||
return "Book already removed from shelf", 410
|
||||
|
||||
ub.session.delete(book_shelf)
|
||||
shelf.last_modified = datetime.utcnow()
|
||||
ub.session.commit()
|
||||
|
||||
try:
|
||||
ub.session.delete(book_shelf)
|
||||
shelf.last_modified = datetime.utcnow()
|
||||
ub.session.commit()
|
||||
except (OperationalError, InvalidRequestError):
|
||||
ub.session.rollback()
|
||||
flash(_(u"Settings DB is not Writeable"), category="error")
|
||||
if "HTTP_REFERER" in request.environ:
|
||||
return redirect(request.environ["HTTP_REFERER"])
|
||||
else:
|
||||
return redirect(url_for('web.index'))
|
||||
if not xhr:
|
||||
flash(_(u"Book has been removed from shelf: %(sname)s", sname=shelf.name), category="success")
|
||||
if "HTTP_REFERER" in request.environ:
|
||||
@ -235,7 +255,11 @@ def create_shelf():
|
||||
ub.session.commit()
|
||||
flash(_(u"Shelf %(title)s created", title=to_save["title"]), category="success")
|
||||
return redirect(url_for('shelf.show_shelf', shelf_id=shelf.id))
|
||||
except (OperationalError, InvalidRequestError):
|
||||
ub.session.rollback()
|
||||
flash(_(u"Settings DB is not Writeable"), category="error")
|
||||
except Exception:
|
||||
ub.session.rollback()
|
||||
flash(_(u"There was an error"), category="error")
|
||||
return render_title_template('shelf_edit.html', shelf=shelf, title=_(u"Create a Shelf"), page="shelfcreate")
|
||||
else:
|
||||
@ -280,7 +304,11 @@ def edit_shelf(shelf_id):
|
||||
try:
|
||||
ub.session.commit()
|
||||
flash(_(u"Shelf %(title)s changed", title=to_save["title"]), category="success")
|
||||
except (OperationalError, InvalidRequestError):
|
||||
ub.session.rollback()
|
||||
flash(_(u"Settings DB is not Writeable"), category="error")
|
||||
except Exception:
|
||||
ub.session.rollback()
|
||||
flash(_(u"There was an error"), category="error")
|
||||
return render_title_template('shelf_edit.html', shelf=shelf, title=_(u"Edit a shelf"), page="shelfedit")
|
||||
else:
|
||||
@ -298,17 +326,22 @@ def delete_shelf_helper(cur_shelf):
|
||||
log.info("successfully deleted %s", cur_shelf)
|
||||
|
||||
|
||||
|
||||
@shelf.route("/shelf/delete/<int:shelf_id>")
|
||||
@login_required
|
||||
def delete_shelf(shelf_id):
|
||||
cur_shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
|
||||
delete_shelf_helper(cur_shelf)
|
||||
try:
|
||||
delete_shelf_helper(cur_shelf)
|
||||
except (OperationalError, InvalidRequestError):
|
||||
ub.session.rollback()
|
||||
flash(_(u"Settings DB is not Writeable"), category="error")
|
||||
return redirect(url_for('web.index'))
|
||||
|
||||
|
||||
@shelf.route("/shelf/<int:shelf_id>", defaults={'shelf_type': 1})
|
||||
@shelf.route("/shelf/<int:shelf_id>/<int:shelf_type>")
|
||||
@login_required
|
||||
@login_required_if_no_ano
|
||||
def show_shelf(shelf_type, shelf_id):
|
||||
shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
|
||||
|
||||
@ -327,8 +360,12 @@ def show_shelf(shelf_type, shelf_id):
|
||||
cur_book = calibre_db.get_book(book.book_id)
|
||||
if not cur_book:
|
||||
log.info('Not existing book %s in %s deleted', book.book_id, shelf)
|
||||
ub.session.query(ub.BookShelf).filter(ub.BookShelf.book_id == book.book_id).delete()
|
||||
ub.session.commit()
|
||||
try:
|
||||
ub.session.query(ub.BookShelf).filter(ub.BookShelf.book_id == book.book_id).delete()
|
||||
ub.session.commit()
|
||||
except (OperationalError, InvalidRequestError):
|
||||
ub.session.rollback()
|
||||
flash(_(u"Settings DB is not Writeable"), category="error")
|
||||
return render_title_template(page, entries=result, title=_(u"Shelf: '%(name)s'", name=shelf.name),
|
||||
shelf=shelf, page="shelf")
|
||||
else:
|
||||
@ -348,7 +385,11 @@ def order_shelf(shelf_id):
|
||||
setattr(book, 'order', to_save[str(book.book_id)])
|
||||
counter += 1
|
||||
# if order diffrent from before -> shelf.last_modified = datetime.utcnow()
|
||||
ub.session.commit()
|
||||
try:
|
||||
ub.session.commit()
|
||||
except (OperationalError, InvalidRequestError):
|
||||
ub.session.rollback()
|
||||
flash(_(u"Settings DB is not Writeable"), category="error")
|
||||
|
||||
shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
|
||||
result = list()
|
||||
|
1
cps/static/css/caliBlur.min.css
vendored
@ -2949,7 +2949,6 @@ body > div.container-fluid > div > div.col-sm-10 > div > div > div.col-sm-3.col-
|
||||
#bookDetailsModal > .modal-dialog.modal-lg > .modal-content > .modal-body > div > div > div > div.col-sm-3.col-lg-3.col-xs-5 > div.cover, body > div.container-fluid > div > div.col-sm-10 > div > div > div.col-sm-3.col-lg-3.col-xs-5 > div.cover {
|
||||
margin: 0;
|
||||
width: 100%;
|
||||
height: 100%
|
||||
}
|
||||
|
||||
#bookDetailsModal > .modal-dialog.modal-lg > .modal-content > .modal-body > div > div > div > div.col-sm-3.col-lg-3.col-xs-5 > div.cover > img, body > div.container-fluid > div > div.col-sm-10 > div > div > div.col-sm-3.col-lg-3.col-xs-5 > div.cover > img {
|
||||
|
11
cps/static/css/libs/bootstrap-table.min.css
vendored
File diff suppressed because one or more lines are too long
@ -74,8 +74,8 @@ $(function () {
$("#meta-info").html("<ul id=\"book-list\" class=\"media-list\"></ul>");
}
if ((ggDone === 3 || (ggDone === 1 && ggResults.length === 0)) &&
(dbDone === 3 || (ggDone === 1 && dbResults.length === 0)) &&
(cvDone === 3 || (ggDone === 1 && cvResults.length === 0))) {
(dbDone === 3 || (dbDone === 1 && dbResults.length === 0)) &&
(cvDone === 3 || (cvDone === 1 && cvResults.length === 0))) {
$("#meta-info").html("<p class=\"text-danger\">" + msg.no_result + "</p>");
return;
}
File diff suppressed because one or more lines are too long
10
cps/static/js/libs/bootstrap-table/locale/bootstrap-table-fi-FI.min.js
vendored
Normal file
File diff suppressed because one or more lines are too long
10
cps/static/js/libs/bootstrap-table/locale/bootstrap-table-fr-CH.min.js
vendored
Normal file
File diff suppressed because one or more lines are too long
10
cps/static/js/libs/bootstrap-table/locale/bootstrap-table-fr-LU.min.js
vendored
Normal file
File diff suppressed because one or more lines are too long
10
cps/static/js/libs/bootstrap-table/locale/bootstrap-table-nl-BE.min.js
vendored
Normal file
File diff suppressed because one or more lines are too long
10
cps/static/js/libs/bootstrap-table/locale/bootstrap-table-sr-Cyrl-RS.min.js
vendored
Normal file
File diff suppressed because one or more lines are too long
10
cps/static/js/libs/bootstrap-table/locale/bootstrap-table-sr-Latn-RS.min.js
vendored
Normal file
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
@ -88,6 +88,12 @@
|
||||
<div class="col-xs-6 col-sm-6">{{_('Port')}}</div>
|
||||
<div class="col-xs-6 col-sm-6">{{config.config_port}}</div>
|
||||
</div>
|
||||
{% if feature_support['kobo'] and config.config_port != config.config_external_port %}
|
||||
<div class="row">
|
||||
<div class="col-xs-6 col-sm-6">{{_('External Port')}}</div>
|
||||
<div class="col-xs-6 col-sm-6">{{config.config_external_port}}</div>
|
||||
</div>
|
||||
{% endif %}
|
||||
</div>
|
||||
<div class="col-xs-12 col-sm-6">
|
||||
<div class="row">
|
||||
|
@ -67,7 +67,7 @@
|
||||
<table class="table" id="identifier-table">
|
||||
{% for identifier in book.identifiers %}
|
||||
<tr>
|
||||
<td><input type="text" class="form-control" name="identifier-type-{{identifier.type}}" value="{{identifier.type}}" required="required" placeholder="{{_('Identifier Type')}}"></td>
|
||||
<td><input type="text" class="form-control" name="identifier-type-{{identifier.type}}" value="{{identifier.type}}" required="required" placeholder="{{_('Identifier Type')}}"></td>
|
||||
<td><input type="text" class="form-control" name="identifier-val-{{identifier.type}}" value="{{identifier.val}}" required="required" placeholder="{{_('Identifier Value')}}"></td>
|
||||
<td><a class="btn btn-default" onclick="removeIdentifierLine(this)">{{_('Remove')}}</a></td>
|
||||
</tr>
|
||||
@ -92,15 +92,19 @@
|
||||
<label for="rating">{{_('Rating')}}</label>
|
||||
<input type="number" name="rating" id="rating" class="rating input-lg" data-clearable="" value="{% if book.ratings %}{{(book.ratings[0].rating / 2)|int}}{% endif %}">
|
||||
</div>
|
||||
{% if g.user.role_upload() or g.user.role_admin()%}
|
||||
{% if g.allow_upload %}
|
||||
<div class="form-group">
|
||||
<label for="cover_url">{{_('Fetch Cover from URL (JPEG - Image will be downloaded and stored in database)')}}</label>
|
||||
<input type="text" class="form-control" name="cover_url" id="cover_url" value="">
|
||||
</div>
|
||||
<div class="form-group" aria-label="Upload cover from local drive">
|
||||
<div class="form-group" aria-label="Upload cover from local drive">
|
||||
<label class="btn btn-primary btn-file" for="btn-upload-cover">{{ _('Upload Cover from Local Disk') }}</label>
|
||||
<div class="upload-cover-input-text" id="upload-cover"></div>
|
||||
<input id="btn-upload-cover" name="btn-upload-cover" type="file" accept=".jpg, .jpeg, .png, .webp">
|
||||
</div>
|
||||
</div>
|
||||
{% endif %}
|
||||
{% endif %}
|
||||
<div class="form-group">
|
||||
<label for="pubdate">{{_('Published Date')}}</label>
|
||||
<div style="position: relative">
|
||||
@ -132,19 +136,20 @@
|
||||
<input type="number" step="{% if c.datatype == 'float' %}0.01{% else %}1{% endif %}" class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}" value="{% if book['custom_column_' ~ c.id]|length > 0 %}{{ book['custom_column_' ~ c.id][0].value }}{% endif %}">
|
||||
{% endif %}
|
||||
|
||||
{% if c.datatype in ['text', 'series'] and not c.is_multiple %}
|
||||
<input type="text" class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}"
|
||||
{% if book['custom_column_' ~ c.id]|length > 0 %}
|
||||
value="{{ book['custom_column_' ~ c.id][0].value }}"
|
||||
{% endif %}>
|
||||
{% endif %}
|
||||
|
||||
{% if c.datatype in ['text', 'series'] and c.is_multiple %}
|
||||
{% if c.datatype == 'text' %}
|
||||
<input type="text" class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}"
|
||||
{% if book['custom_column_' ~ c.id]|length > 0 %}
|
||||
value="{% for column in book['custom_column_' ~ c.id] %}{{ column.value.strip() }}{% if not loop.last %}, {% endif %}{% endfor %}"{% endif %}>
|
||||
{% endif %}
|
||||
|
||||
{% if c.datatype == 'series' %}
|
||||
<input type="text" class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}"
|
||||
{% if book['custom_column_' ~ c.id]|length > 0 %}
|
||||
value="{% for column in book['custom_column_' ~ c.id] %} {{ '%s [%s]' % (book['custom_column_' ~ c.id][0].value, book['custom_column_' ~ c.id][0].extra|formatfloat(2)) }}{% if not loop.last %}, {% endif %}{% endfor %}"
|
||||
{% endif %}>
|
||||
{% endif %}
|
||||
|
||||
|
||||
{% if c.datatype == 'enumeration' %}
|
||||
<select class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}">
|
||||
<option></option>
|
||||
@ -159,9 +164,9 @@
|
||||
{% endif %}
|
||||
|
||||
{% if c.datatype == 'rating' %}
|
||||
<input type="number" min="1" max="5" step="1" class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}"
|
||||
<input type="number" min="1" max="5" step="0.5" class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}"
|
||||
{% if book['custom_column_' ~ c.id]|length > 0 %}
|
||||
value="{{ '%d' % (book['custom_column_' ~ c.id][0].value / 2) }}"
|
||||
value="{{ '%.1f' % (book['custom_column_' ~ c.id][0].value / 2) }}"
|
||||
{% endif %}>
|
||||
{% endif %}
|
||||
</div>
|
||||
|
@ -30,20 +30,20 @@
|
||||
<div data-related="gdrive_settings">
|
||||
{% if gdriveError %}
|
||||
<div class="form-group">
|
||||
<label>
|
||||
<label id="gdrive_error">
|
||||
{{_('Google Drive config problem')}}: {{ gdriveError }}
|
||||
</label>
|
||||
</div>
|
||||
{% else %}
|
||||
{% if show_authenticate_google_drive and g.user.is_authenticated and config.config_use_google_drive %}
|
||||
<div class="form-group required">
|
||||
<a href="{{ url_for('gdrive.authenticate_google_drive') }}" class="btn btn-primary">{{_('Authenticate Google Drive')}}</a>
|
||||
<a href="{{ url_for('gdrive.authenticate_google_drive') }}" id="gdrive_auth" class="btn btn-primary">{{_('Authenticate Google Drive')}}</a>
|
||||
</div>
|
||||
{% else %}
|
||||
{% if show_authenticate_google_drive and g.user.is_authenticated and not config.config_use_google_drive %}
|
||||
<div >{{_('Please hit submit to continue with setup')}}</div>
|
||||
<div >{{_('Please hit save to continue with setup')}}</div>
|
||||
{% endif %}
|
||||
{% if not g.user.is_authenticated %}
|
||||
{% if not g.user.is_authenticated and show_login_button %}
|
||||
<div >{{_('Please finish Google Drive setup after login')}}</div>
|
||||
{% endif %}
|
||||
{% if g.user.is_authenticated %}
|
||||
@ -194,10 +194,14 @@
|
||||
<label for="config_kobo_sync">{{_('Enable Kobo sync')}}</label>
|
||||
</div>
|
||||
<div data-related="kobo-settings">
|
||||
<div class="form-group" style="text-indent:10px;">
|
||||
<div class="form-group" style="margin-left:10px;">
|
||||
<input type="checkbox" id="config_kobo_proxy" name="config_kobo_proxy" {% if config.config_kobo_proxy %}checked{% endif %}>
|
||||
<label for="config_kobo_proxy">{{_('Proxy unknown requests to Kobo Store')}}</label>
|
||||
</div>
|
||||
<div class="form-group" style="margin-left:10px;">
|
||||
<label for="config_external_port">{{_('Server External Port (for port forwarded API calls)')}}</label>
|
||||
<input type="number" min="1" max="65535" class="form-control" name="config_external_port" id="config_external_port" value="{% if config.config_external_port != None %}{{ config.config_external_port }}{% endif %}" autocomplete="off" required>
|
||||
</div>
|
||||
</div>
|
||||
{% endif %}
|
||||
{% if feature_support['goodreads'] %}
|
||||
|
@ -81,7 +81,7 @@
|
||||
|
||||
{% for format in entry.data %}
|
||||
{% if format.format|lower in audioentries %}
|
||||
<li><a target="_blank" href="{{ url_for('web.read_book', book_id=entry.id, book_format=format.format|lower) }}">{{format.format}}</a></li>
|
||||
<li><a target="_blank" href="{{ url_for('web.read_book', book_id=entry.id, book_format=format.format|lower) }}">{{format.format|lower }}</a></li>
|
||||
{% endif %}
|
||||
{% endfor %}
|
||||
</ul>
|
||||
@ -174,7 +174,7 @@
|
||||
{{ c.name }}:
|
||||
{% for column in entry['custom_column_' ~ c.id] %}
|
||||
{% if c.datatype == 'rating' %}
|
||||
{{ '%d' % (column.value / 2) }}
|
||||
{{ (column.value / 2)|formatfloat }}
|
||||
{% else %}
|
||||
{% if c.datatype == 'bool' %}
|
||||
{% if column.value == true %}
|
||||
@ -182,9 +182,17 @@
|
||||
{% else %}
|
||||
<span class="glyphicon glyphicon-remove"></span>
|
||||
{% endif %}
|
||||
{% else %}
|
||||
{% if c.datatype == 'float' %}
|
||||
{{ column.value|formatfloat(2) }}
|
||||
{% else %}
|
||||
{% if c.datatype == 'series' %}
|
||||
{{ '%s [%s]' % (column.value, column.extra|formatfloat(2)) }}
|
||||
{% else %}
|
||||
{{ column.value }}
|
||||
{% endif %}
|
||||
{% endif %}
|
||||
{% endif %}
|
||||
{% endif %}
|
||||
{% endfor %}
|
||||
{% endif %}
|
||||
|
@ -37,7 +37,7 @@
|
||||
</div>
|
||||
<label for="mail_size">{{_('Attachment Size Limit')}}</label>
|
||||
<div class="form-group input-group">
|
||||
<input type="number" min="1" max="600" step="1" class="form-control" name="mail_size" id="mail_size" value="{% if content.mail_size != None %}{{ (content.mail_size / 1024 / 1024)|int }}{% endif %}">
|
||||
<input type="number" min="1" max="600" step="1" class="form-control" name="mail_size" id="mail_size" value="{% if content.mail_size != None %}{{ (content.mail_size / 1024 / 1024)|int }}{% endif %}" required>
|
||||
<span class="input-group-btn">
|
||||
<button type="button" id="attachement_size" class="btn btn-default" disabled>MB</button>
|
||||
</span>
|
||||
|
@ -5,7 +5,7 @@
|
||||
{{_('Open the .kobo/Kobo eReader.conf file in a text editor and add (or edit):')}}</a>
|
||||
</p>
|
||||
<p>
|
||||
{% if not warning %}'api_endpoint='{{kobo_auth_url}}{% else %}{{warning}}{% endif %}</a>
|
||||
{% if not warning %}api_endpoint={{kobo_auth_url}}{% else %}{{warning}}{% endif %}</a>
|
||||
</p>
|
||||
<p>
|
||||
</div>
|
||||
|
@ -64,7 +64,7 @@
|
||||
<form id="form-upload" class="navbar-form" action="{{ url_for('editbook.upload') }}" method="post" enctype="multipart/form-data">
|
||||
<div class="form-group">
|
||||
<span class="btn btn-default btn-file">{{_('Upload')}}<input id="btn-upload" name="btn-upload"
|
||||
type="file" accept="{% for format in accept %}.{{format}}{{ ',' if not loop.last }}{% endfor %}" multiple></span>
|
||||
type="file" accept="{% for format in accept %}.{% if format != ''%}{{format}}{% else %}*{% endif %}{{ ',' if not loop.last }}{% endfor %}" multiple></span>
|
||||
</div>
|
||||
</form>
|
||||
</li>
|
||||
|
@ -165,7 +165,7 @@
|
||||
{% endif %}
|
||||
|
||||
{% if c.datatype == 'rating' %}
|
||||
<input type="number" min="1" max="5" step="1" class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}">
|
||||
<input type="number" min="1" max="5" step="0.5" class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}">
|
||||
{% endif %}
|
||||
</div>
|
||||
{% endfor %}
|
||||
|
@ -17,7 +17,7 @@
|
||||
{{entry['series_index']}} - {{entry['series'][0].name}}
|
||||
{% endif %}
|
||||
<br>
|
||||
{% for author in entry['authors'] %}
|
||||
{% for author in entry['author'] %}
|
||||
{{author.name.replace('|',',')}}
|
||||
{% if not loop.last %}
|
||||
&
|
||||
|
@ -25,6 +25,7 @@
|
||||
</tr>
|
||||
</tbody>
|
||||
</table>
|
||||
{% if g.user.role_admin() %}
|
||||
<h3>{{_('Linked Libraries')}}</h3>
|
||||
<table id="libs" class="table">
|
||||
<thead>
|
||||
@ -44,4 +45,5 @@
|
||||
{% endfor %}
|
||||
</tbody>
|
||||
</table>
|
||||
{% endif %}
|
||||
{% endblock %}
|
||||
|
Binary file not shown.
File diff suppressed because it is too large
Binary file not shown.
File diff suppressed because it is too large
Binary file not shown.
File diff suppressed because it is too large
Binary file not shown.
File diff suppressed because it is too large
Binary file not shown.
Some files were not shown because too many files have changed in this diff