Merge branch 'development'

This commit is contained in:
evilhero 2019-05-08 10:40:32 -04:00
commit 298534958d
24 changed files with 1419 additions and 574 deletions

README.md

@@ -1,66 +1,103 @@
## ![Mylar Logo](https://github.com/evilhero/mylar/blob/master/data/images/mylarlogo.png) Mylar

Mylar is an automated Comic Book (cbr/cbz) downloader program for use with NZBs and torrents, written in Python. It supports SABnzbd, NZBGet, and many torrent clients in addition to DDL.

It will allow you to monitor weekly pull-lists for items belonging to user-specific series to download, as well as being able to monitor story arcs. Support for TPBs and GNs is also now available.

This program is considered an "Alpha release" and is still in development. It is not bug-free, but it does work!

## Support & Discussion

You are free to join the Mylar support community on IRC, where you can ask questions, hang around and discuss anything related to Mylar.

1. Use any IRC client and connect to the Freenode server, `irc.freenode.net`.
2. Join the `#mylar` channel.
The Mylar Forums are also online @ https://forum.mylarcomics.com

**Issues** can be reported on the GitHub issue tracker, provided that you:
- Search existing recent OPEN issues first. If an issue is open from a year ago, please don't add to it.
- Always follow the issue template!
- Close your issue when it's solved!

## Requirements
- At least Python version 2.7.9 (3.x is not supported)
- ComicVine API key (found [here](https://comicvine.gamespot.com/api/)) - the program will have limited to no functionality without it
- UnRaR / RAR, if metatagging is enabled within the program
## Usage

To start the program, type `python Mylar.py` inside the root of the Mylar directory. Typing `python Mylar.py --help` will give a list of available options.

Once it's started, navigate to http://localhost:8090 in your web browser (or the IP of the machine that Mylar is running on).

Helpful hints:
- Ensure `Comic Location` is specified in the configuration (_Configuration --> Web Interface --> Comic Location_)
  - Mylar auto-creates the Comic Series directories under the Comic Location (the directory is displayed on the Comic Detail page)
  - If you do not want directories to be created until there are issues present, set `create_folders = False` in the config.ini
- A search provider needs to be specified to perform any search-related functions
- Enabling `Automatically Mark Upcoming Issues as Wanted` in settings will mark any **NEW** comic from the Pullist that is on your watchlist as wanted
- Add a comic (series) using the Search button or via the Pullist
  - If you know the ComicVine comicid, enter the full id into the search box (i.e. `4050-XXXXX`)
- If adding a comic fails with "Error", submit a bug and it will be checked out (usually an easy fix)
- Post-Processing is for adding new issues into existing series on your watchlist; Import is for adding files for series that don't yet exist on your watchlist
- For the most up-to-date build, use the Development build
  - Master doesn't get updated as frequently (> month), and Development is usually stable
## Post-processing

It is imperative that you enable the post-processing option if you require post-processing (_Configuration --> Quality & Post-Processing --> Enable Post-Processing_).

### Newsgroups

There are 2 ways to perform post-processing within Mylar; you cannot use both options simultaneously.

**ComicRN**
- You need to enable the Mylar API key for this to work (_Configuration --> Web Interface --> API --> Enable API --> Generate --> Save Configuration_).
- Within the post-processing/ folder of Mylar there are 2 files (autoProcessComics.py and autoProcessComics.cfg.sample).
- Within the post-processing/ folder of Mylar there are 2 directories (nzbget, sabnzbd), and within each of these client folders is a ComicRN.py script that is to be used with the respective download client.
- Edit autoProcessComics.cfg.sample (put in your Mylar host, port, apikey (generated above), and ssl (0 for no, 1 for yes)) and rename it to autoProcessComics.cfg.
- Copy autoProcessComics.py, autoProcessComics.cfg and the respective ComicRN.py into your SABnzbd/NZBGet scripts directory (or wherever your download client stores its scripts).
- Make sure SABnzbd/NZBGet is set up to have a 'comic-related' category that points to the ComicRN.py script that was just moved.
- Ensure the category in Mylar is named exactly the same.
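A filled-in autoProcessComics.cfg might look something like this (a hypothetical sketch; the section and key names here are assumptions based on the fields listed above, so check the shipped .cfg.sample for the exact names):

```ini
[Mylar]
host = localhost
port = 8090
apikey = 1234567890abcdef
ssl = 0
```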
**Completed Download Handling (CDH)**
- For the given download client (SABnzbd / NZBGet), simply click on the Enable Completed Download Handling option.
- For SABnzbd to work, you need a version > 0.8.0 (use the Test Connection button for verification).
### Torrents

There is no completed-download handling for torrents. There are, however, methods available:

**Torrent client on the same machine as the Mylar installation**
- _Configuration tab --> Quality & Post-Processing --> Post-Processing_
  - Set the post-processing action to copy if you want to seed your torrents, otherwise move.
- Enable Folder Monitoring
  - Folder location to monitor: set to the full location where your finished torrents are downloaded to.
  - Folder Monitor Scan Interval: do NOT set this to < 1 minute. Anywhere from 3-5 minutes should be ample.

**Torrent client on a different machine than Mylar**
- Use [harpoon](https://github.com/evilhero/harpoon/) to retrieve items back to your local install as soon as they are completed and have post-processing occur immediately (also works with other automated solutions).
- Any other method that involves having the files localized and then having Folder Monitor watch the location for files.

**Torrent client (rtorrent/deluge) on a different machine than Mylar**
- A built-in option for these clients will monitor for completion and then perform post-processing on the given items.
- The files are located in the `post-processing/torrent-auto-snatch` location within the mylar directory.
- Read the read.me therein for configuration / setup.

### DDL

When using DDL, post-processing is initiated immediately upon successful completion. By default the items are downloaded to the cache directory location and removed after post-processing. If you wish to change the default directory location, specify the full directory location in the config.ini `ddl_location` field.
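For example, to keep DDL downloads out of the cache directory, config.ini could contain something like the following (a hedged illustration; only the `ddl_location` key is named above, and where it sits within your config.ini is an assumption - keep it wherever your config.ini already defines it):

```ini
ddl_location = /data/downloads/ddl
```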
## Renaming files and folders

You can specify how Mylar renames both files and folders during post-processing / import.

### Folder Format
- If left blank or at the default value of `ComicName-(Year)`, Mylar will create subdirectories in the format `ComicName-(Year)`.
- You can use multiple directory hierarchies as well - e.g. $Publisher/$Series/$Year would give you DC Comics/Wildman/2011.
- Folder Format **is** used on every Add Series / Refresh Series request.
- Enabling `Renaming` has no bearing on this, so if you're not using the default, make sure it's what you want.

### File Format
- To enable renaming for files, you need to enable the Rename Files option; otherwise Mylar will use the original filename and not rename at all.
- This includes replacing spaces, lowercasing and zero suppression (all renaming features).
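As an illustration of how $-token formats like the ones above expand (a hypothetical sketch, not Mylar's actual renaming code; the token names come from the examples in this section):

```python
import string

def render_format(fmt, values):
    """Expand $Token placeholders in a folder/file format string."""
    # string.Template understands $Publisher-style tokens; safe_substitute
    # leaves any unknown tokens untouched instead of raising an error.
    return string.Template(fmt).safe_substitute(values)

tokens = {"Publisher": "DC Comics", "Series": "Wildman", "Year": "2011"}
print(render_format("$Publisher/$Series/$Year", tokens))  # DC Comics/Wildman/2011
```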
The Main page ...
![preview thumb](http://i.imgur.com/GLGMj.png)

The Search page ...
![preview thumb](http://i.imgur.com/EM21C.png)

The Comic Detail page ...
![preview thumb](http://i.imgur.com/6z5mH.png)
![preview thumb](http://i.imgur.com/ETuXp.png)

The Pull page ...
![preview thumb](http://i.imgur.com/VWTDQ.png)

The Config screen ...
![preview thumb](http://i.imgur.com/nQjIN.png)

_<p align="center">You can contribute by sending in your bug reports / enhancement requests.<br/>
Telling us what's working helps too!</p>_


@@ -237,12 +237,13 @@
<div class="row">
<label>Volume Number</label>
<input type="text" name="comic_version" value="${comic['ComicVersion']}" size="20"><br/>
<small>Format: vN where N is a number 0-infinity</small>
</div>
<div class="row">
<label>Directory Location</label>
<input type="text" name="com_location" value="${comic['ComicLocation']}" size="90"><br/>
<small>The directory where all the comics are located for this particular comic</small>
</div>
<div class="row">
@@ -450,7 +451,7 @@
<a href="#" title="Mark issue as Skipped" onclick="doAjaxCall('unqueueissue?IssueID=${issue['IssueID']}&ComicID=${issue['ComicID']}',$(this),'table')" data-success="'${issue['Issue_Number']}' has been marked as skipped"><img src="interfaces/default/images/skipped_icon.png" height="25" width="25" class="highqual" /></a>
%else:
<a href="#" title="Retry the same download again" onclick="doAjaxCall('queueit?ComicID=${issue['ComicID']}&IssueID=${issue['IssueID']}&ComicIssue=${issue['Issue_Number']}&mode=want', $(this),'table')" data-success="Retrying the same version of '${issue['ComicName']}' '${issue['Issue_Number']}'"><img src="interfaces/default/images/retry_icon.png" height="25" width="25" class="highqual" /></a>
<a href="#" title="Mark issue as Skipped" onclick="doAjaxCall('unqueueissue?IssueID=${issue['IssueID']}&ComicID=${issue['ComicID']}',$(this),'table')" data-success="'${issue['Issue_Number']}' has been marked as skipped"><img src="interfaces/default/images/skipped_icon.png" height="25" width="25" class="highqual" /></a>
%endif
<!--
<a href="#" onclick="doAjaxCall('archiveissue?IssueID=${issue['IssueID']}&comicid=${comic['ComicID']}',$(this),'table')"><img src="interfaces/default/images/archive_icon.png" height="25" width="25" title="Mark issue as Archived" class="highqual" /></a>

data/interfaces/default/config.html Executable file → Normal file

@@ -502,8 +502,8 @@
</td>
<td>
<fieldset>
<legend>Torrents</legend>
<div class="row checkbox">
<input id="enable_torrents" type="checkbox" onclick="initConfigCheckbox($(this));" name="enable_torrents" value=1 ${config['enable_torrents']} /><label>Use Torrents</label>
</div>
@@ -654,27 +654,30 @@
<small>Folder path where torrent download will be assigned</small>
</div>
</fieldset>
<fieldset id="deluge_options">
<div class="row">
<label>Deluge Host:Port </label>
<input type="text" id="deluge_host" name="deluge_host" value="${config['deluge_host']}" size="30">
<small>(ie. 192.168.1.2:58846) port uses the deluge daemon port (remote connection to the daemon has to be enabled)</small>
</div>
<div class="row">
<label>Deluge Username</label>
<input type="text" id="deluge_username" name="deluge_username" value="${config['deluge_username']}" size="30">
</div>
<div class="row">
<label>Deluge Password</label>
<input type="password" id="deluge_password" name="deluge_password" value="${config['deluge_password']}" size="30">
</div>
<div class="row">
<label>Deluge Label</label>
<input type="text" name="deluge_label" value="${config['deluge_label']}" size="30"><br/>
<small>Label to be used on the torrents</small>
</div>
<div class="row">
<img name="deluge_status_icon" id="deluge_status_icon" src="interfaces/default/images/successs.png" style="float:right;visibility:hidden;" height="20" width="20" />
<input type="button" value="Test Connection" id="deluge_test" /><br/>
</div>
</fieldset>
<fieldset id="qbittorrent_options">
<div class="row">
<label>qBittorrent Host:Port </label>
@@ -716,8 +719,9 @@
<img name="qbittorrent_statusicon" id="qbittorrent_statusicon" src="interfaces/default/images/successs.png" style="float:right;visibility:hidden;" height="20" width="20" />
<input type="button" value="Test Connection" id="qbittorrent_test" />
</div>
</fieldset>
</div>
</fieldset>
</td>
</tr>
</table>
@@ -1318,53 +1322,6 @@
</div>
</div>
</fieldset>
<fieldset>
<h3><img src="interfaces/default/images/nma_logo.png" style="vertical-align: middle; margin: 3px; margin-top: -1px;" height="30" width="30"/>NotifyMyAndroid</h3>
<div class="checkbox row">
<input type="checkbox" name="nma_enabled" id="nma" value="1" ${config['nma_enabled']} /><label>Enable NotifyMyAndroid</label>
</div>
<div id="nmaoptions">
<div class="row">
<div class="row checkbox">
<input type="checkbox" name="nma_onsnatch" value="1" ${config['nma_onsnatch']} /><label>Notify on snatch?</label>
</div>
<label>NotifyMyAndroid API Key</label>
<input type="text" name="nma_apikey" id="nma_apikey" value="${config['nma_apikey']}" size="30">
<small>Separate multiple api keys with commas</small>
</div>
<div class="row">
<label>Priority</label>
<select name="nma_priority">
%for x in [-2,-1,0,1,2]:
<%
if config['nma_priority'] == x:
nma_priority_selected = 'selected'
else:
nma_priority_selected = ''
if x == -2:
nma_priority_value = 'Very Low'
elif x == -1:
nma_priority_value = 'Moderate'
elif x == 0:
nma_priority_value = 'Normal'
elif x == 1:
nma_priority_value = 'High'
else:
nma_priority_value = 'Emergency'
%>
<option value=${x} ${nma_priority_selected}>${nma_priority_value}</option>
%endfor
</select>
</div>
<div align="center" class="row">
<img name="nma_statusicon" id="nma_statusicon" src="interfaces/default/images/successs.png" style="float:right;visibility:hidden;" height="20" width="20" />
<input type="button" value="Test NMA" id="nma_test" style="float:center" /></br>
<input type="text" name="nmastatus" style="text-align:center; font-size:11px;" id="nmastatus" size="55" DISABLED />
</div>
</div>
</fieldset>
<fieldset>
<h3><img src="interfaces/default/images/pushover_logo.png" style="vertical-align: middle; margin: 3px; margin-top: -1px;" height="30" width="30"/>Pushover</h3>
<div class="row checkbox">
@@ -1490,6 +1447,58 @@
</div>
</fieldset>
<fieldset>
<h3><img src="interfaces/default/images/email.png" style="vertical-align: middle; margin: 3px; margin-top: -1px;" height="30" width="30"/>Email</h3>
<div class="row checkbox">
<input type="checkbox" name="email_enabled" id="email" value="1" ${config['email_enabled']} /><label>Enable Email Notifications</label>
</div>
<div id="emailoptions">
<div class="row">
<label>Sender address</label><input type="email" name="email_from" id="email_from" value="${config['email_from']}" size="50" maxlength="80" style="width: 18em">
<small>sender&#64;hostname</small>
</div>
<div class="row">
<label>Recipient address</label><input type="email" name="email_to" id="email_to" value="${config['email_to']}" size="50" maxlength="80" style="width: 18em">
<small>destination&#64;hostname</small>
</div>
<div class="row">
<label>SMTP server</label><input type="text" name="email_server" id="email_server" value="${config['email_server']}" size="50">
<small>Hostname or IP address of your SMTP server</small>
</div>
<div class="row">
<label>SMTP port</label><input type="number" name="email_port" id="email_port" value="${config['email_port']}" size="5" style="width: 5em">
<small>SMTP port - usually 25, 465 or 587</small>
</div>
<div class="row">
<label>SMTP username</label><input type="text" name="email_user" id="email_user" value="${config['email_user']}" size="50">
<small>Username for SMTP server authentication</small>
</div>
<div class="row">
<label>SMTP password</label><input type="password" name="email_password" id="email_password" value="${config['email_password']}" size="50">
<small>Password for SMTP server authentication</small>
</div>
<input type="radio" style="vertical-align: middle; margin: 3px; margin-top: -1px;" name="email_enc" id="email_raw" value="0" ${config['email_raw']} checked />No encryption&nbsp;&nbsp;
<input type="radio" style="vertical-align: middle; margin: 3px; margin-top: -1px;" name="email_enc" id="email_ssl" value="1" ${config['email_ssl']} />Use SSL?&nbsp;&nbsp;
<input type="radio" style="vertical-align: middle; margin: 3px; margin-top: -1px;" name="email_enc" id="email_tls" value="2" ${config['email_tls']} />Use TLS?
<div class="row checkbox">
<small>SMTP server requires TLS or SSL encryption?</small>
</div>
<div class="row checkbox">
<input type="checkbox" name="email_ongrab" value="1" ${config['email_ongrab']} /><label>Notify on grab?</label>
<small>Notify when comics are grabbed?</small>
</div>
<div class="row checkbox">
<input type="checkbox" name="email_onpost" value="1" ${config['email_onpost']} /><label>Notify on post processing?</label>
<small>Notify when comics are post processed?</small>
</div>
<div align="center" class="row">
<img name="email_statusicon" id="email_statusicon" src="interfaces/default/images/successs.png" style="float:right;visibility:hidden;" height="20" width="20" />
<input type="button" value="Test Email" id="email_test" style="float:center" /><br/>
<input type="text" name="emailstatus" style="text-align:center; font-size:11px;" id="emailstatus" size="55" DISABLED />
</div>
</div>
</fieldset>
</td>
</tr>
</table>
@@ -1683,26 +1692,6 @@
}
});
if ($("#nma").is(":checked"))
{
$("#nmaoptions").show();
}
else
{
$("#nmaoptions").hide();
}
$("#nma").click(function(){
if ($("#nma").is(":checked"))
{
$("#nmaoptions").slideDown();
}
else
{
$("#nmaoptions").slideUp();
}
});
if ($("#pushover").is(":checked"))
{
$("#pushoveroptions").show();
@@ -1763,6 +1752,26 @@
}
});
if ($("#email").is(":checked"))
{
$("#emailoptions").show();
}
else
{
$("#emailoptions").hide();
}
$("#email").click(function(){
if ($("#email").is(":checked"))
{
$("#emailoptions").slideDown();
}
else
{
$("#emailoptions").slideUp();
}
});
if ($("#boxcar").is(":checked"))
{
$("#boxcaroptions").show();
@@ -2221,25 +2230,51 @@
$('#ajaxMsg').addClass('success').fadeIn().delay(3000).fadeOut();
});
$('#deluge_test').click(function () {
var imagechk = document.getElementById("deluge_status_icon");
var host = document.getElementById("deluge_host").value;
var username = document.getElementById("deluge_username").value;
var password = document.getElementById("deluge_password").value;
$.get("testdeluge",
{ host: host, username: username, password: password },
function(data){
if (data.error != undefined) {
alert(data.error);
return;
}
$('#ajaxMsg').html("<div class='msg'><span class='ui-icon ui-icon-check'></span>"+data+"</div>");
if ( data.indexOf("Successfully") > -1){
imagechk.src = "";
imagechk.src = "interfaces/default/images/success.png";
imagechk.style.visibility = "visible";
} else {
imagechk.src = "";
imagechk.src = "interfaces/default/images/fail.png";
imagechk.style.visibility = "visible";
}
});
$('#ajaxMsg').addClass('success').fadeIn().delay(3000).fadeOut();
});
$(".newznabtest").click(function () {
var newznab = this.attributes["name"].value.replace('newznab_test', '');
if ( newznab.indexOf("test_dognzb") > -1) {
var imagechk = document.getElementById("dognzb_statusicon");
var name = 'DOGnzb';
var host = 'https://api.dognzb.cr';
var ssl = document.getElementById("dognzb_verify").checked;
var apikey = document.getElementById("dognzb_apikey").value;
} else if ( newznab.indexOf("test_nzbsu") > -1) {
var imagechk = document.getElementById("nzbsu_statusicon");
var name = 'nzb.su';
var host = 'https://api.nzb.su';
var ssl = document.getElementById("nzbsu_verify").checked;
var apikey = document.getElementById("nzbsu_apikey").value;
} else {
var imagechk = document.getElementById("newznabstatus"+newznab);
var name = document.getElementById("newznab_name"+newznab).value;
var host = document.getElementById("newznab_host"+newznab).value;
var ssl = document.getElementById("newznab_verify"+newznab).checked;
var apikey = document.getElementById("newznab_api"+newznab).value;
}
$.get("testnewznab",
@@ -2290,31 +2325,6 @@
$('#ajaxMsg').addClass('success').fadeIn().delay(3000).fadeOut();
});
$('#nma_test').click(function () {
var imagechk = document.getElementById("nma_statusicon");
var apikey = document.getElementById("nma_apikey").value;
$.get("testNMA",
{ apikey: apikey },
function(data){
if (data.error != undefined) {
alert(data.error);
return;
}
$('#nmastatus').val(data);
$('#ajaxMsg').html("<div class='msg'><span class='ui-icon ui-icon-check'></span>"+data+"</div>");
if ( data.indexOf("Successfully") > -1){
imagechk.src = "";
imagechk.src = "interfaces/default/images/success.png";
imagechk.style.visibility = "visible";
} else {
imagechk.src = "";
imagechk.src = "interfaces/default/images/fail.png";
imagechk.style.visibility = "visible";
}
});
$('#ajaxMsg').addClass('success').fadeIn().delay(3000).fadeOut();
});
$('#prowl_test').click(function () {
var imagechk = document.getElementById("prowl_statusicon");
var apikey = document.getElementById("prowl_keys");
@@ -2467,6 +2477,37 @@
$('#ajaxMsg').addClass('success').fadeIn().delay(3000).fadeOut();
});
$('#email_test').click(function () {
var imagechk = document.getElementById("email_statusicon");
var emailfrom = document.getElementById("email_from").value;
var emailto = document.getElementById("email_to").value;
var emailsvr = document.getElementById("email_server").value;
var emailport = document.getElementById("email_port").value;
var emailuser = document.getElementById("email_user").value;
var emailpass = document.getElementById("email_password").value;
var emailenc = document.querySelector('input[name="email_enc"]:checked').value; // While this causes a browser crash if no radio button is checked, we're ok because we force the default to be checked in the form.
$.get("testemail",
{ emailfrom: emailfrom, emailto: emailto, emailsvr: emailsvr, emailport: emailport, emailuser: emailuser, emailpass: emailpass, emailenc: emailenc },
function(data){
if (data.error != undefined) {
alert(data.error);
return;
}
$('#emailstatus').val(data);
$('#ajaxMsg').html("<div class='msg'><span class='ui-icon ui-icon-check'></span>"+data+"</div>");
if ( data.indexOf("Successfully") > -1){
imagechk.src = "";
imagechk.src = "interfaces/default/images/success.png";
imagechk.style.visibility = "visible";
} else {
imagechk.src = "";
imagechk.src = "interfaces/default/images/fail.png";
imagechk.style.visibility = "visible";
}
});
$('#ajaxMsg').addClass('success').fadeIn().delay(3000).fadeOut();
});
$(function() {
$( "#tabs" ).tabs();
});


@@ -1690,6 +1690,62 @@ div#artistheader h2 a {
min-width: 95px;
vertical-align: middle;
}
#queue_table th#qcomicid {
max-width: 10px;
text-align: center;
}
#queue_table th#qseries {
max-width: 475px;
text-align: center;
}
#queue_table th#qsize {
max-width: 30px;
text-align: center;
}
#queue_table th#qprogress {
max-width: 25px;
text-align: center;
}
#queue_table th#qstatus {
max-width: 50px;
text-align: center;
}
#queue_table th#qdate {
max-width: 90px;
text-align: center;
}
#queue_table th#qoptions {
min-width: 160px;
text-align: center;
}
#queue_table td#qcomicid {
max-width: 10px;
text-align: left;
}
#queue_table td#qseries {
max-width: 475px;
text-align: left;
}
#queue_table td#qsize {
max-width: 30px;
text-align: center;
}
#queue_table td#qprogress {
max-width: 25px;
text-align: center;
}
#queue_table td#qstatus {
max-width: 50px;
text-align: center;
}
#queue_table td#qdate {
min-width: 90px;
text-align: center;
}
#queue_table td#qoptions {
min-width: 160px;
text-align: center;
}
DIV.progress-container
{
position: relative;

Binary file not shown (new file; size: 21 KiB)


@@ -24,6 +24,16 @@
<div class="title">
<h1 class="clearfix"><img src="interfaces/default/images/icon_logs.png" alt="Logs"/>Logs</h1>
</div>
<div align="center">Refresh rate:
<select id="refreshrate" onchange="setRefresh()">
<option value="0" selected="selected">No Refresh</option>
<option value="5">5 Seconds</option>
<option value="15">15 Seconds</option>
<option value="30">30 Seconds</option>
<option value="60">60 Seconds</option>
<option value="300">5 Minutes</option>
<option value="600">10 Minutes</option>
</select></div>
<table class="display_log" id="log_table">
<thead>
<tr>
@@ -36,16 +46,7 @@
</tbody>
</table>
<br>
<div align="center">Refresh rate:
<select id="refreshrate" onchange="setRefresh()">
<option value="0" selected="selected">No Refresh</option>
<option value="5">5 Seconds</option>
<option value="15">15 Seconds</option>
<option value="30">30 Seconds</option>
<option value="60">60 Seconds</option>
<option value="300">5 Minutes</option>
<option value="600">10 Minutes</option>
</select></div>
</%def>
<%def name="headIncludes()">

View File

@@ -9,6 +9,7 @@
<a id="menu_link_edit" href="manageComics">Manage Comics</a>
<a id="menu_link_edit" href="manageIssues?status=Wanted">Manage Issues</a>
<a id="menu_link_edit" href="manageFailed">Manage Failed Links</a>
<a id="menu_link_edit" href="queueManage">Manage DDL Queue</a>
</div>
</div>
</%def>

View File

@@ -41,6 +41,7 @@
<th id="name">Comic Name</th>
<th id="status">Status</th>
<th id="stat_icon"></th>
<th class="hidden" id="stat_title"></th>
<th id="latest">Latest Issue</th>
<th id="publisher">Publisher</th>
<th id="have">Have</th>
@@ -66,15 +67,16 @@
<td id="status">${comic['recentstatus']}</td>
<td id="stat_icon">
%if comic['Status'] == 'Paused':
    <img src="interfaces/default/images/pause-icon.png" title="Paused" alt="Paused" width="16" height="16" />
%elif comic['Status'] == 'Loading':
    <img src="interfaces/default/images/hourglass.png" title="Loading" alt="Loading" width="16" height="16" />
%elif comic['Status'] == 'Error':
    <img src="interfaces/default/images/cross.png" title="Error" alt="Error" width="16" height="16" />
%else:
    <img src="interfaces/default/images/checkmark.png" title="Active" alt="Active" width="16" height="16" />
%endif
</td>
<td class="hidden" id="stat_title">${comic['Status']}</td>
<td id="latest">${comic['LatestIssue']} (${comic['LatestDate']})</td>
<td id="publisher">${comic['ComicPublisher']}</td>
<td id="have" valign="center"><span title="${comic['percent']}"></span><div class="progress-container"><div style="width:${comic['percent']}%"><div style="width:${comic['percent']}%"><div class="havetracks">${comic['haveissues']}/${comic['totalissues']}</div></div></div></td>

View File

@@ -0,0 +1,258 @@
<%inherit file="base.html"/>
<%
import mylar
%>
<%def name="headerIncludes()">
<div id="subhead_container">
<div id="subhead_menu">
<a id="menu_link_refresh" href="#" title="Restart stalled queue" onclick="doAjaxCall('ddl_requeue?mode=restart_queue',$(this),'table')" data-success="Restarted Queue">Restart Queue</a>
</div>
</div>
</%def>
<%def name="body()">
<div id="paddingheader">
<h1 class="clearfix">QUEUE MANAGEMENT</h1><br/>
</div>
<div style="text-align:center;">
<h2><center><b>ACTIVE</b></center></h2>
<div id="active_message" align="center" style="display:none;">
<div id="amessage" align="center"></div>
<div id="queuebuttons"><div id="queue_menu" style="display:none;"><br/>
<a id="arestartddl" name="restartddl" href="#" title="Restart stalled download" data-success="Restarted Download">Restart Download</a>
<a id="aresumeddl" name="resumeddl" href="#" title="Resume download" data-success="Resumed Download">Resume Download</a>
<a id="aabortddl" name="abortddl" href="#" title="Abort download" data-success="Aborted Download">Abort Download</a>
</div></div>
</div>
<div id="active_queue" style="display:none;">
<table width="100%" cellpadding="5" cellspacing="5">
<tbody>
<div style="display: flex; justify-content: flex-end">
<div id="queuebuttons"><div id="queue_menu">
<a id="qrestartddl" name="restartddl" href="#" title="Restart stalled download" data-success="Restarted Download">Restart Download</a><br/>
<a id="qresumeddl" name="resumeddl" href="#" title="Resume download" data-success="Resumed Download">Resume Download</a><br/>
<a id="qabortddl" name="abortddl" href="#" title="Abort download" data-success="Aborted Download">Abort Download</a>
</div></div>
</div>
<tr><td id="series" align="center" style="text-align:center"></td></tr>
<tr><td id="filename" align="center" style="text-align:center"></td></tr>
<tr><td id="size" align="center" style="text-align:center"></td></tr>
<tr><td align="center" style="text-align:center">
<div style="display:table;position:relative;margin:auto;top:0px;"><span id="progress_percent"></span><div class="progress-container complete"><div id="prog_width"><span class="progressbar-front-text" style="margin:auto;top:-3px;" id="progress" name="progress" value="0%"></span></div></div></div>
</td></tr>
<tr><td id="status" align="center" style="text-align:center"></td></tr>
</tbody>
</table>
</div>
</div>
<br/><br/>
<h2><center><b>HISTORY</b></center></h2>
%if type(resultlist) == str:
<center>${resultlist}</center><br/>
%else:
<div class="table_wrapper">
<table class="display" id="queue_table">
<thead>
<tr>
<th id="qcomicid">ComicID</th>
<th id="qseries">Series</th>
<th id="qsize">Size</th>
<th id="qprogress">%</th>
<th id="qstatus">Status</th>
<th id="qdate">Updated</th>
<th id="qoptions">Options</th>
</tr>
</thead>
<tbody>
</tbody>
</table>
</div>
%endif
</%def>
<%def name="headIncludes()">
<link rel="stylesheet" href="interfaces/default/css/data_table.css">
</%def>
<%def name="javascriptIncludes()">
<script src="js/libs/jquery.dataTables.min.js"></script>
<script src="js/libs/full_numbers_no_ellipses.js"></script>
<script>
var ImportTimer = setInterval(activecheck, 5000);
function activecheck() {
$.get('check_ActiveDDL',
function(data){
if (data.error != undefined) {
alert(data.error);
return;
};
var obj = JSON.parse(data);
var percent = obj['percent'];
var status = obj['status'];
var aid = obj['a_id'];
if (status == 'Downloading') {
acm = document.getElementById("active_message");
acm.style.display = "none";
acq = document.getElementById("active_queue");
acq.style.display = "unset";
document.getElementById("prog_width").style.width=percent;
$("#progress").html(percent);
document.getElementById("series").innerHTML = obj['a_series'];
document.getElementById("filename").innerHTML = obj['a_filename'];
document.getElementById("size").innerHTML = obj['a_size'];
document.getElementById("status").innerHTML = status;
qmm = document.getElementById("queue_menu");
qmm.style.display = "inline-block";
$("#qrestartddl").attr('onClick', "ajaxcallit('restart', '"+aid+"')");
$("#qresumeddl").attr('onClick', "ajaxcallit('resume', '"+aid+"')");
$("#qabortddl").attr('onClick', "ajaxcallit('abort', '"+aid+"')");
} else if ( status.indexOf("File does not exist") > -1){
acm = document.getElementById("active_message");
acm.style.display = "inline-block";
acq = document.getElementById("active_queue");
acq.style.display = "none";
document.getElementById("amessage").innerHTML = status;
qmm = document.getElementById("queue_menu");
qmm.style.display = "inline-block";
$("#arestartddl").attr('onClick', "ajaxcallit('restart', '"+aid+"')");
$("#aresumeddl").attr('onClick', "ajaxcallit('resume', '"+aid+"')");
$("#aabortddl").attr('onClick', "ajaxcallit('abort', '"+aid+"')");
} else {
acm = document.getElementById("active_message");
acm.style.display = "inline-block";
acq = document.getElementById("active_queue");
acq.style.display = "none";
qmm = document.getElementById("queue_menu");
qmm.style.display = "none";
document.getElementById("amessage").innerHTML = status;
}
if (percent == '100%') {
clearInterval(ImportTimer);
$('#queue_table').DataTable().ajax.reload(null, false);
ImportTimer = setInterval(activecheck, 5000);
}
});
};
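The polling handler above assumes `check_ActiveDDL` returns a JSON object whose fields match what it reads (`percent`, `status`, `a_id`, `a_series`, `a_filename`, `a_size`). A minimal Python sketch of a payload in that shape (the values are illustrative, not taken from Mylar itself):

```python
import json

# Hypothetical example of the payload shape activecheck() consumes;
# the field names come from the handler above, the values are made up.
payload = {
    "percent": "45%",                    # drives the progress-bar width
    "status": "Downloading",             # branch selector in activecheck()
    "a_id": "abc123",                    # queue id wired into restart/resume/abort
    "a_series": "Example Series (2019)",
    "a_filename": "Example.Series.001.cbz",
    "a_size": "42.0 MB",
}
data = json.dumps(payload)

# The browser side does JSON.parse(data) and branches on 'status'.
obj = json.loads(data)
assert obj["status"] == "Downloading"
```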
function ajaxcallit(qmode, queueid) {
$.get("ddl_requeue",
{ mode: qmode, id: queueid },
function(data){
if (data.error != undefined) {
alert(data.error);
return;
}
var objd = JSON.parse(data);
// every queue action displays the returned message the same way
if (['restart', 'resume', 'abort', 'remove', 'restart-queue'].indexOf(qmode) > -1) {
    $('#ajaxMsg').html("<div class='msg'><span class='ui-icon ui-icon-check'></span>"+objd['message']+"</div>");
}
});
$('#ajaxMsg').addClass('success').fadeIn().delay(3000).fadeOut();
$('#queue_table').DataTable().ajax.reload(null, false);
};
function initThisPage() {
initActions();
$("#queuebuttons #queue_menu #arestartddl, #qrestartddl").button({ icons: { primary: "ui-icon-refresh" } });
$("#queuebuttons #queue_menu #aresumeddl, #qresumeddl").button({ icons: { primary: "ui-icon-pencil" } });
$("#queuebuttons #queue_menu #aabortddl, #qabortddl").button({ icons: { primary: "ui-icon-pencil" } });
if ( $.fn.dataTable.isDataTable( '#queue_table' ) ) {
$('#queue_table').DataTable().ajax.reload(null, false);
} else {
$('#queue_table').DataTable( {
"processing": true,
"serverSide": true,
"ajaxSource": 'queueManageIt',
"paginationType": "full_numbers",
"sorting": [[0, 'desc']],
"displayLength": 15,
"stateSave": false,
"columnDefs": [
{
"sortable": false,
"targets": [ 0 ],
"visible": false,
},
{
"sortable": true,
"targets": [ 1 ],
"visible": true,
"data": "Series",
"render":function (data,type,full) {
return '<span title="' + full[1] + '"></span><a href="comicDetails?ComicID=' + full[0] + '">' + full[1] + '</a>';
}
},
{
"sortable": false,
"targets": [ 6 ],
"visible": true,
"render":function (data,type,full) {
var val = full[4];
var restartline = "('ddl_requeue?mode=restart&id="+String(full[6])+"',$(this));"
var resumeline = "('ddl_requeue?mode=resume&id="+String(full[6])+"',$(this));"
var removeline = "('ddl_requeue?mode=remove&id="+String(full[6])+"',$(this));"
if (val == 'Completed' || val == 'Failed' || val == 'Downloading'){
return '<span title="Restart"></span><a href="#" onclick="doAjaxCall'+restartline+'">Restart</a><span title="Remove"></span><a href="#" onclick="doAjaxCall'+removeline+'"><span class="ui-icon ui-icon-plus"></span>Remove</a>';
} else if (val == 'Incomplete') {
return '<span title="Restart"></span><a href="#" onclick="doAjaxCall'+restartline+'">Restart</a><span title="Resume"></span><a href="#" onclick="doAjaxCall'+resumeline+'"><span class="ui-icon ui-icon-plus"></span>Resume</a>';
} else if (val == 'Queued') {
return '<span title="Start"></span><a href="#" onclick="doAjaxCall'+restartline+'">Start</a>';
}
}
},
],
"language": {
"search":"Filter:",
"lengthMenu":"Show _MENU_ items per page",
"emptyTable": "No information available",
"info":"Showing _START_ to _END_ of _TOTAL_ items",
"infoEmpty":"Showing 0 to 0 of 0 lines",
"infoFiltered":"(filtered from _MAX_ total items)"
},
"rowCallback": function (nRow, aData, iDisplayIndex, iDisplayIndexFull) {
if (aData[4] === "Completed") {
$('td', nRow).closest('tr').addClass("gradeA");
} else if (aData[4] === "Queued") {
$('td', nRow).closest('tr').addClass("gradeL");
} else if (aData[4] === "Incomplete" || aData[4] == "Failed") {
$('td', nRow).closest('tr').addClass("gradeX");
}
nRow.children[0].id = 'qcomicid';
nRow.children[1].id = 'qseries';
nRow.children[2].id = 'qsize';
nRow.children[3].id = 'qprogress';
nRow.children[4].id = 'qstatus';
nRow.children[5].id = 'qdate';
return nRow;
},
"drawCallback": function (o) {
// Jump to top of page
$('html,body').scrollTop(0);
},
"serverData": function ( sSource, aoData, fnCallback ) {
/* Add some extra data to the sender */
$.getJSON(sSource, aoData, function (json) {
fnCallback(json)
});
},
"fnInitComplete": function(oSettings, json)
{
},
});
};
activecheck();
};
$(document).ready(function() {
initThisPage();
});
</script>
</%def>
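The history table above runs with `serverSide` enabled, so `queueManageIt` has to answer in the DataTables server-side shape, with row columns in the order the render callbacks index them (`full[0]` = ComicID, `full[1]` = Series, `full[4]` = Status, `full[6]` = queue id). A hedged Python sketch of such a response, using the legacy-style response keys that match the legacy-style options above (values are illustrative):

```python
import json

# Illustrative rows only; column order mirrors the render callbacks:
# [ComicID, Series, Size, %, Status, Updated, queue id].
rows = [
    ["1001", "Example Series (2019)", "42 MB", "100%", "Completed", "2019-05-08 10:40", "q-abc123"],
    ["1002", "Another Series (2018)", "15 MB", "37%", "Incomplete", "2019-05-08 10:41", "q-def456"],
]
response = json.dumps({
    # Legacy DataTables server-side keys (1.9-style).
    "iTotalRecords": len(rows),
    "iTotalDisplayRecords": len(rows),
    "aaData": rows,
})

parsed = json.loads(response)
assert parsed["aaData"][0][4] == "Completed"
```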

View File

@@ -2,17 +2,21 @@ import logging
import random
import re
import subprocess
import copy
import time
from requests.sessions import Session
from collections import OrderedDict
try:
    from urlparse import urlparse
    from urlparse import urlunparse
except ImportError:
    from urllib.parse import urlparse
    from urllib.parse import urlunparse
__version__ = "1.9.7"
DEFAULT_USER_AGENTS = [
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/65.0.3325.181 Safari/537.36",
@@ -24,8 +28,6 @@ DEFAULT_USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 6.3; Win64; x64; rv:57.0) Gecko/20100101 Firefox/57.0"
]
DEFAULT_USER_AGENT = random.choice(DEFAULT_USER_AGENTS)
BUG_REPORT = """\
Cloudflare may have changed their technique, or there may be a bug in the script.
@@ -45,12 +47,13 @@ https://github.com/Anorov/cloudflare-scrape/issues\
class CloudflareScraper(Session):
    def __init__(self, *args, **kwargs):
        self.default_delay = 8
        self.delay = kwargs.pop("delay", self.default_delay)
        super(CloudflareScraper, self).__init__(*args, **kwargs)
        if "requests" in self.headers["User-Agent"]:
            # Set a random User-Agent if no custom User-Agent has been set
            self.headers["User-Agent"] = random.choice(DEFAULT_USER_AGENTS)
    def is_cloudflare_challenge(self, resp):
        return (
@@ -61,6 +64,19 @@ class CloudflareScraper(Session):
        )
    def request(self, method, url, *args, **kwargs):
        self.headers = (
            OrderedDict(
                [
                    ('User-Agent', self.headers['User-Agent']),
                    ('Accept', 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8'),
                    ('Accept-Language', 'en-US,en;q=0.5'),
                    ('Accept-Encoding', 'gzip, deflate'),
                    ('Connection', 'close'),
                    ('Upgrade-Insecure-Requests', '1')
                ]
            )
        )
        resp = super(CloudflareScraper, self).request(method, url, *args, **kwargs)
        # Check if Cloudflare anti-bot is on
@@ -70,22 +86,22 @@ class CloudflareScraper(Session):
        return resp
    def solve_cf_challenge(self, resp, **original_kwargs):
        start_time = time.time()
        body = resp.text
        parsed_url = urlparse(resp.url)
        domain = parsed_url.netloc
        submit_url = "%s://%s/cdn-cgi/l/chk_jschl" % (parsed_url.scheme, domain)
        cloudflare_kwargs = copy.deepcopy(original_kwargs)
        params = cloudflare_kwargs.setdefault("params", {})
        headers = cloudflare_kwargs.setdefault("headers", {})
        headers["Referer"] = resp.url
        try:
            params["s"] = re.search(r'name="s"\svalue="(?P<s_value>[^"]+)', body).group('s_value')
            params["jschl_vc"] = re.search(r'name="jschl_vc" value="(\w+)"', body).group(1)
            params["pass"] = re.search(r'name="pass" value="(.+?)"', body).group(1)
        except Exception as e:
            # Something is wrong with the page.
            # This may indicate Cloudflare has changed their anti-bot
@@ -96,16 +112,28 @@ class CloudflareScraper(Session):
        # Solve the Javascript challenge
        params["jschl_answer"] = self.solve_challenge(body, domain)
        # Check if the default delay has been overridden. If not, use the delay required by
        # cloudflare.
        if self.delay == self.default_delay:
            try:
                self.delay = float(re.search(r"submit\(\);\r?\n\s*},\s*([0-9]+)", body).group(1)) / float(1000)
            except:
                pass
        # Requests transforms any request into a GET after a redirect,
        # so the redirect has to be handled manually here to allow for
        # performing other types of requests even as the first request.
        method = resp.request.method
        cloudflare_kwargs["allow_redirects"] = False
        end_time = time.time()
        # Cloudflare requires a delay before solving the challenge
        time.sleep(self.delay - (end_time - start_time))
        redirect = self.request(method, submit_url, **cloudflare_kwargs)
        redirect_location = urlparse(redirect.headers["Location"])
        if not redirect_location.netloc:
            redirect_url = urlunparse((parsed_url.scheme, domain, redirect_location.path, redirect_location.params, redirect_location.query, redirect_location.fragment))
            return self.request(method, redirect_url, **original_kwargs)
        return self.request(method, redirect.headers["Location"], **original_kwargs)
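The delay override above reads Cloudflare's own `setTimeout` interval (in milliseconds) out of the challenge page. A standalone check of that extraction against a made-up challenge snippet:

```python
import re

# Made-up fragment of a challenge page; real pages embed the form
# submission inside a setTimeout with the required delay.
body = (
    "setTimeout(function(){\n"
    "    f.submit();\n"
    "  }, 4000);"
)
default_delay = 8
delay = default_delay
match = re.search(r"submit\(\);\r?\n\s*},\s*([0-9]+)", body)
if match:
    # Cloudflare expresses the delay in milliseconds.
    delay = float(match.group(1)) / 1000.0

assert delay == 4.0
```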
@@ -116,8 +144,15 @@ class CloudflareScraper(Session):
        except Exception:
            raise ValueError("Unable to identify Cloudflare IUAM Javascript on website. %s" % BUG_REPORT)
        js = re.sub(r"a\.value = (.+\.toFixed\(10\);).+", r"\1", js)
        # Match code that accesses the DOM and remove it, but without stripping too much.
        try:
            solution_name = re.search("s,t,o,p,b,r,e,a,k,i,n,g,f,\s*(.+)\s*=", js).groups(1)
            match = re.search("(.*};)\n\s*(t\s*=(.+))\n\s*(;%s.*)" % (solution_name), js, re.M | re.I | re.DOTALL).groups()
            js = match[0] + match[-1]
        except Exception:
            raise ValueError("Error parsing Cloudflare IUAM Javascript challenge. %s" % BUG_REPORT)
        js = js.replace("t.length", str(len(domain)))
        # Strip characters that could be used to exit the string context
        # These characters are not currently used in Cloudflare's arithmetic snippet
@@ -126,9 +161,30 @@ class CloudflareScraper(Session):
        if "toFixed" not in js:
            raise ValueError("Error parsing Cloudflare IUAM Javascript challenge. %s" % BUG_REPORT)
        # 2019-03-20: Cloudflare sometimes stores part of the challenge in a div which is later
        # added using document.getElementById(x).innerHTML, so it is necessary to simulate that
        # method and value.
        try:
            # Find the id of the div in the javascript code.
            k = re.search(r"k\s+=\s+'([^']+)';", body).group(1)
            # Find the div with that id and store its content.
            val = re.search(r'<div(.*)id="%s"(.*)>(.*)</div>' % (k), body).group(3)
        except Exception:
            # If not available, either the code has been modified again, or the old
            # style challenge is used.
            k = ''
            val = ''
        # Use vm.runInNewContext to safely evaluate code
        # The sandboxed code cannot use the Node.js standard library
        # Add the atob method which is now used by Cloudflare's code, but is not available in all node versions.
        simulate_document_js = 'var document= {getElementById: function(x) { return {innerHTML:"%s"};}}' % (val)
        atob_js = 'var atob = function(str) {return Buffer.from(str, "base64").toString("binary");}'
        # t is not defined, so we have to define it and set it to the domain name.
        js = '%s;%s;var t="%s";%s' % (simulate_document_js, atob_js, domain, js)
        buffer_js = "var Buffer = require('buffer').Buffer"
        # Pass Buffer into the new context, so it is available for atob.
        js = "%s;console.log(require('vm').runInNewContext('%s', {'Buffer':Buffer,'g':String.fromCharCode}, {timeout: 5000}));" % (buffer_js, js)
        try:
            result = subprocess.check_output(["node", "-e", js]).strip()
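The hidden-div handling above can be exercised on its own. The snippet below runs the same two regexes against a made-up fragment of a challenge page (the HTML here is illustrative, not real Cloudflare output):

```python
import re

# Made-up fragment: a script assigning the div id to k, and the hidden
# div whose innerHTML holds part of the challenge expression.
body = (
    "<script> k = 'cf-dn-Abc123';</script>\n"
    '<div style="display:none;visibility:hidden;" id="cf-dn-Abc123">+((+!![]))</div>'
)
try:
    # Find the id of the div in the javascript code.
    k = re.search(r"k\s+=\s+'([^']+)';", body).group(1)
    # Find the div with that id and store its content.
    val = re.search(r'<div(.*)id="%s"(.*)>(.*)</div>' % (k), body).group(3)
except Exception:
    # Fall back to empty values, as the scraper does.
    k = ''
    val = ''

assert k == 'cf-dn-Abc123'
assert val == '+((+!![]))'
```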

View File

@@ -693,6 +693,7 @@ class PostProcessor(object):
for isc in issuechk:
    datematch = "True"
    datechkit = False
    if isc['ReleaseDate'] is not None and isc['ReleaseDate'] != '0000-00-00':
        try:
            if isc['DigitalDate'] != '0000-00-00' and int(re.sub('-', '', isc['DigitalDate']).strip()) <= int(re.sub('-', '', isc['ReleaseDate']).strip()):
@@ -743,18 +744,21 @@ class PostProcessor(object):
            logger.fdebug('%s[ISSUE-VERIFY] %s is before the issue year %s that was discovered in the filename' % (module, isc['IssueDate'], watchmatch['issue_year']))
            datematch = "False"
        if int(watch_issueyear) != int(watchmatch['issue_year']):
            if int(monthval[5:7]) == 11 or int(monthval[5:7]) == 12:
                issyr = int(monthval[:4]) + 1
                logger.fdebug('%s[ISSUE-VERIFY] IssueYear (issyr) is %s' % (module, issyr))
                datechkit = True
            elif int(monthval[5:7]) == 1 or int(monthval[5:7]) == 2 or int(monthval[5:7]) == 3:
                issyr = int(monthval[:4]) - 1
                datechkit = True
        if datechkit is True and issyr is not None:
            logger.fdebug('%s[ISSUE-VERIFY] %s comparing to %s : rechecking by month-check versus year.' % (module, issyr, watchmatch['issue_year']))
            datematch = "True"
            if int(issyr) != int(watchmatch['issue_year']):
                logger.fdebug('%s[ISSUE-VERIFY][.:FAIL:.] Issue is before the modified issue year of %s' % (module, issyr))
                datematch = "False"
    else:
        if fcdigit is None:
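The month-based year shift in the hunk above accounts for cover dates that straddle calendar years: a November/December cover date usually belongs to the next publication year, while January-March can belong to the previous one. A standalone sketch of that adjustment (the helper name is mine, not Mylar's):

```python
# Hypothetical helper illustrating the cover-date year adjustment:
# Nov/Dec cover dates shift forward a year, Jan-Mar shift back.
def adjusted_issue_year(monthval):
    """monthval is an ISO date string such as '2018-12-01'."""
    month = int(monthval[5:7])
    year = int(monthval[:4])
    if month in (11, 12):
        return year + 1
    elif month in (1, 2, 3):
        return year - 1
    return year

assert adjusted_issue_year('2018-12-01') == 2019  # Dec 2018 cover -> 2019
assert adjusted_issue_year('2019-02-01') == 2018  # Feb 2019 cover -> 2018
assert adjusted_issue_year('2019-06-01') == 2019  # mid-year: unchanged
```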
@@ -763,6 +767,8 @@ class PostProcessor(object):
    logger.info('%s[ISSUE-VERIFY] Found matching issue # %s for ComicID: %s / IssueID: %s' % (module, fcdigit, cs['ComicID'], isc['IssueID']))
    if datematch == "True":
        #need to reset this to False here so that the True doesn't carry down and avoid the year checks due to the True
        datematch = "False"
        # if we get to here, we need to do some more comparisons just to make sure we have the right volume
        # first we chk volume label if it exists, then we drop down to issue year
        # if the above both don't exist, and there's more than one series on the watchlist (or the series is > v1)
@@ -1023,6 +1029,7 @@ class PostProcessor(object):
else:
    for isc in issuechk:
        datematch = "True"
        datechkit = False
        if isc['ReleaseDate'] is not None and isc['ReleaseDate'] != '0000-00-00':
            try:
                if isc['DigitalDate'] != '0000-00-00' and int(re.sub('-', '', isc['DigitalDate']).strip()) <= int(re.sub('-', '', isc['ReleaseDate']).strip()):
@@ -1085,18 +1092,21 @@ class PostProcessor(object):
            logger.fdebug('%s[ARC ISSUE-VERIFY] %s is before the issue year %s that was discovered in the filename' % (module, isc['IssueDate'], arcmatch['issue_year']))
            datematch = "False"
        if int(arc_issueyear) != int(arcmatch['issue_year']):
            if int(monthval[5:7]) == 11 or int(monthval[5:7]) == 12:
                issyr = int(monthval[:4]) + 1
                datechkit = True
                logger.fdebug('%s[ARC ISSUE-VERIFY] IssueYear (issyr) is %s' % (module, issyr))
            elif int(monthval[5:7]) == 1 or int(monthval[5:7]) == 2 or int(monthval[5:7]) == 3:
                issyr = int(monthval[:4]) - 1
                datechkit = True
        if datechkit is True and issyr is not None:
            logger.fdebug('%s[ARC ISSUE-VERIFY] %s comparing to %s : rechecking by month-check versus year.' % (module, issyr, arcmatch['issue_year']))
            datematch = "True"
            if int(issyr) != int(arcmatch['issue_year']):
                logger.fdebug('%s[.:FAIL:.] Issue is before the modified issue year of %s' % (module, issyr))
                datematch = "False"
    else:
        if fcdigit is None:
@@ -1108,7 +1118,8 @@ class PostProcessor(object):
logger.fdebug('temploc: %s' % helpers.issuedigits(temploc))
logger.fdebug('arcissue: %s' % helpers.issuedigits(v[i]['ArcValues']['IssueNumber']))
if datematch == "True" and helpers.issuedigits(temploc) == helpers.issuedigits(v[i]['ArcValues']['IssueNumber']):
    #reset datematch here so it doesn't carry the value down and avoid year checks
    datematch = "False"
    arc_values = v[i]['WatchValues']
    if any([arc_values['ComicVersion'] is None, arc_values['ComicVersion'] == 'None']):
        tmp_arclist_vol = '1'
@@ -1275,8 +1286,19 @@ class PostProcessor(object):
if temploc is not None and fcdigit == helpers.issuedigits(ofv['Issue_Number']) or all([temploc is None, helpers.issuedigits(ofv['Issue_Number']) == '1']):
    if watchmatch['sub']:
        clocation = os.path.join(watchmatch['comiclocation'], watchmatch['sub'], helpers.conversion(watchmatch['comicfilename']))
        if not os.path.exists(clocation):
            scrubs = re.sub(watchmatch['comiclocation'], '', watchmatch['sub']).strip()
            if scrubs[:2] == '//' or scrubs[:2] == '\\':
                scrubs = scrubs[1:]
            if os.path.exists(scrubs):
                logger.fdebug('[MODIFIED CLOCATION] %s' % scrubs)
                clocation = scrubs
    else:
        if self.issueid is not None and os.path.isfile(watchmatch['comiclocation']):
            clocation = watchmatch['comiclocation']
        else:
            clocation = os.path.join(watchmatch['comiclocation'], helpers.conversion(watchmatch['comicfilename']))
    oneoff_issuelist.append({"ComicLocation": clocation,
                             "ComicID": ofv['ComicID'],
                             "IssueID": ofv['IssueID'],
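The `scrubs` fallback above strips the configured comic location out of the sub-directory path and trims a doubled leading separator. A standalone sketch with made-up paths (note that `re.sub` removes every occurrence of the location string; the backslash test is written here as a two-character literal, which appears to be what the original single-backslash comparison intends):

```python
import re

# Illustrative values only; in Mylar these come from watchmatch.
comiclocation = '/comics'
sub = '/comics//comics/Example Series (2019)'

# Removing every occurrence of the location can leave a doubled leading
# separator; drop one character in that case.
scrubs = re.sub(comiclocation, '', sub).strip()
if scrubs[:2] in ('//', '\\\\'):
    scrubs = scrubs[1:]

assert scrubs == '/Example Series (2019)'
```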
@@ -1296,124 +1318,130 @@ class PostProcessor(object):
logger.fdebug('%s There are %s files found that match on your watchlist, %s files are considered one-off\'s, and %s files do not match anything' % (module, len(manual_list), len(oneoff_issuelist), int(filelist['comiccount']) - len(manual_list)))
delete_arc = []
if len(manual_arclist) > 0:
    logger.info('[STORY-ARC MANUAL POST-PROCESSING] I have found %s issues that belong to Story Arcs. Flinging them into the correct directories.' % len(manual_arclist))
for ml in manual_arclist: for ml in manual_arclist:
issueid = ml['IssueID'] issueid = ml['IssueID']
ofilename = orig_filename = ml['ComicLocation'] ofilename = orig_filename = ml['ComicLocation']
logger.info('[STORY-ARC POST-PROCESSING] Enabled for %s' % ml['StoryArc']) logger.info('[STORY-ARC POST-PROCESSING] Enabled for %s' % ml['StoryArc'])
grdst = helpers.arcformat(ml['StoryArc'], helpers.spantheyears(ml['StoryArcID']), ml['Publisher']) if all([mylar.CONFIG.STORYARCDIR is True, mylar.CONFIG.COPY2ARCDIR is True]):
logger.info('grdst: %s' % grdst) grdst = helpers.arcformat(ml['StoryArc'], helpers.spantheyears(ml['StoryArcID']), ml['Publisher'])
logger.info('grdst: %s' % grdst)
#tag the meta. #tag the meta.
metaresponse = None metaresponse = None
crcvalue = helpers.crc(ofilename) crcvalue = helpers.crc(ofilename)
if mylar.CONFIG.ENABLE_META: if mylar.CONFIG.ENABLE_META:
logger.info('[STORY-ARC POST-PROCESSING] Metatagging enabled - proceeding...') logger.info('[STORY-ARC POST-PROCESSING] Metatagging enabled - proceeding...')
try: try:
import cmtagmylar import cmtagmylar
metaresponse = cmtagmylar.run(self.nzb_folder, issueid=issueid, filename=ofilename) metaresponse = cmtagmylar.run(self.nzb_folder, issueid=issueid, filename=ofilename)
except ImportError: except ImportError:
logger.warn('%s comictaggerlib not found on system. Ensure the ENTIRE lib directory is located within mylar/lib/comictaggerlib/' % module) logger.warn('%s comictaggerlib not found on system. Ensure the ENTIRE lib directory is located within mylar/lib/comictaggerlib/' % module)
metaresponse = "fail" metaresponse = "fail"
if metaresponse == "fail": if metaresponse == "fail":
logger.fdebug('%s Unable to write metadata successfully - check mylar.log file. Attempting to continue without metatagging...' % module) logger.fdebug('%s Unable to write metadata successfully - check mylar.log file. Attempting to continue without metatagging...' % module)
elif any([metaresponse == "unrar error", metaresponse == "corrupt"]): elif any([metaresponse == "unrar error", metaresponse == "corrupt"]):
logger.error('%s This is a corrupt archive - whether CRC errors or it is incomplete. Marking as BAD, and retrying it.' % module) logger.error('%s This is a corrupt archive - whether CRC errors or it is incomplete. Marking as BAD, and retrying it.' % module)
continue continue
#launch failed download handling here. #launch failed download handling here.
elif metaresponse.startswith('file not found'): elif metaresponse.startswith('file not found'):
filename_in_error = metaresponse.split('||')[1] filename_in_error = metaresponse.split('||')[1]
self._log("The file cannot be found in the location provided for metatagging to be used [%s]. Please verify it exists, and re-run if necessary. Attempting to continue without metatagging..." % (filename_in_error)) self._log("The file cannot be found in the location provided for metatagging to be used [%s]. Please verify it exists, and re-run if necessary. Attempting to continue without metatagging..." % (filename_in_error))
logger.error('%s The file cannot be found in the location provided for metatagging to be used [%s]. Please verify it exists, and re-run if necessary. Attempting to continue without metatagging...' % (module, filename_in_error)) logger.error('%s The file cannot be found in the location provided for metatagging to be used [%s]. Please verify it exists, and re-run if necessary. Attempting to continue without metatagging...' % (module, filename_in_error))
else:
odir = os.path.split(metaresponse)[0]
ofilename = os.path.split(metaresponse)[1]
ext = os.path.splitext(metaresponse)[1]
logger.info('%s Sucessfully wrote metadata to .cbz (%s) - Continuing..' % (module, ofilename))
self._log('Sucessfully wrote metadata to .cbz (%s) - proceeding...' % ofilename)
dfilename = ofilename
else: else:
odir = os.path.split(metaresponse)[0] dfilename = ml['Filename']
ofilename = os.path.split(metaresponse)[1]
ext = os.path.splitext(metaresponse)[1]
logger.info('%s Sucessfully wrote metadata to .cbz (%s) - Continuing..' % (module, ofilename))
self._log('Sucessfully wrote metadata to .cbz (%s) - proceeding...' % ofilename)
dfilename = ofilename
if metaresponse:
src_location = odir
grab_src = os.path.join(src_location, ofilename)
else:
src_location = ofilename
grab_src = ofilename
logger.fdebug('%s Source Path : %s' % (module, grab_src))
checkdirectory = filechecker.validateAndCreateDirectory(grdst, True, module=module)
if not checkdirectory:
logger.warn('%s Error trying to validate/create directory. Aborting this process at this time.' % module)
self.valreturn.append({"self.log": self.log,
"mode": 'stop'})
return self.queue.put(self.valreturn)
#send to renamer here if valid.
if mylar.CONFIG.RENAME_FILES:
renamed_file = helpers.rename_param(ml['ComicID'], ml['ComicName'], ml['IssueNumber'], dfilename, issueid=ml['IssueID'], arc=ml['StoryArc'])
if renamed_file:
dfilename = renamed_file['nfilename']
logger.fdebug('%s Renaming file to conform to configuration: %s' % (module, ofilename))
#if from a StoryArc, check to see if we're appending the ReadingOrder to the filename
if mylar.CONFIG.READ2FILENAME:
logger.fdebug('%s readingorder#: %s' % (module, ml['ReadingOrder']))
if int(ml['ReadingOrder']) < 10: readord = "00" + str(ml['ReadingOrder'])
elif int(ml['ReadingOrder']) >= 10 and int(ml['ReadingOrder']) <= 99: readord = "0" + str(ml['ReadingOrder'])
else: readord = str(ml['ReadingOrder'])
dfilename = str(readord) + "-" + os.path.split(dfilename)[1]
grab_dst = os.path.join(grdst, dfilename)
logger.fdebug('%s Destination Path : %s' % (module, grab_dst))
logger.fdebug('%s Source Path : %s' % (module, grab_src))
logger.info('%s[ONE-OFF MODE][%s] %s into directory : %s' % (module, mylar.CONFIG.ARC_FILEOPS.upper(), grab_src, grab_dst))
#this is also for issues that are part of a story arc, and don't belong to a watchlist series (ie. one-off's)
try:
checkspace = helpers.get_free_space(grdst)
if checkspace is False:
if all([metaresponse is not None, metaresponse != 'fail']): # meta was done
self.tidyup(src_location, True, cacheonly=True)
raise OSError
fileoperation = helpers.file_ops(grab_src, grab_dst, one_off=True)
if not fileoperation:
raise OSError
except Exception as e:
logger.error('%s [ONE-OFF MODE] Failed to %s %s: %s' % (module, mylar.CONFIG.ARC_FILEOPS, grab_src, e))
return
#tidyup old path
if any([mylar.CONFIG.FILE_OPTS == 'move', mylar.CONFIG.FILE_OPTS == 'copy']):
self.tidyup(src_location, True, filename=orig_filename)
#delete entry from nzblog table
#if it was downloaded via mylar from the storyarc section, it will have an 'S' in the nzblog
#if it was downloaded outside of mylar and/or not from the storyarc section, it will be a normal issueid in the nzblog
#IssArcID = 'S' + str(ml['IssueArcID'])
myDB.action('DELETE from nzblog WHERE IssueID=? AND SARC=?', ['S' + str(ml['IssueArcID']),ml['StoryArc']])
myDB.action('DELETE from nzblog WHERE IssueID=? AND SARC=?', [ml['IssueArcID'],ml['StoryArc']])
logger.fdebug('%s IssueArcID: %s' % (module, ml['IssueArcID']))
newVal = {"Status": "Downloaded",
"Location": grab_dst}
else: else:
dfilename = ml['Filename'] newVal = {"Status": "Downloaded",
"Location": ml['ComicLocation']}
if metaresponse:
src_location = odir
grab_src = os.path.join(src_location, ofilename)
else:
src_location = ofilename
grab_src = ofilename
logger.fdebug('%s Source Path : %s' % (module, grab_src))
checkdirectory = filechecker.validateAndCreateDirectory(grdst, True, module=module)
if not checkdirectory:
logger.warn('%s Error trying to validate/create directory. Aborting this process at this time.' % module)
self.valreturn.append({"self.log": self.log,
"mode": 'stop'})
return self.queue.put(self.valreturn)
#send to renamer here if valid.
if mylar.CONFIG.RENAME_FILES:
renamed_file = helpers.rename_param(ml['ComicID'], ml['ComicName'], ml['IssueNumber'], dfilename, issueid=ml['IssueID'], arc=ml['StoryArc'])
if renamed_file:
dfilename = renamed_file['nfilename']
logger.fdebug('%s Renaming file to conform to configuration: %s' % (module, ofilename))
#if from a StoryArc, check to see if we're appending the ReadingOrder to the filename
if mylar.CONFIG.READ2FILENAME:
logger.fdebug('%s readingorder#: %s' % (module, ml['ReadingOrder']))
if int(ml['ReadingOrder']) < 10: readord = "00" + str(ml['ReadingOrder'])
elif int(ml['ReadingOrder']) >= 10 and int(ml['ReadingOrder']) <= 99: readord = "0" + str(ml['ReadingOrder'])
else: readord = str(ml['ReadingOrder'])
dfilename = str(readord) + "-" + os.path.split(dfilename)[1]
grab_dst = os.path.join(grdst, dfilename)
logger.fdebug('%s Destination Path : %s' % (module, grab_dst))
logger.fdebug('%s Source Path : %s' % (module, grab_src))
logger.info('%s[ONE-OFF MODE][%s] %s into directory : %s' % (module, mylar.CONFIG.ARC_FILEOPS.upper(), grab_src, grab_dst))
#this is also for issues that are part of a story arc, and don't belong to a watchlist series (ie. one-off's)
try:
checkspace = helpers.get_free_space(grdst)
if checkspace is False:
if all([metaresponse is not None, metaresponse != 'fail']): # meta was done
self.tidyup(src_location, True, cacheonly=True)
raise OSError
fileoperation = helpers.file_ops(grab_src, grab_dst, one_off=True)
if not fileoperation:
raise OSError
except Exception as e:
logger.error('%s [ONE-OFF MODE] Failed to %s %s: %s' % (module, mylar.CONFIG.ARC_FILEOPS, grab_src, e))
return
#tidyup old path
if any([mylar.CONFIG.FILE_OPTS == 'move', mylar.CONFIG.FILE_OPTS == 'copy']):
self.tidyup(src_location, True, filename=orig_filename)
#delete entry from nzblog table
#if it was downloaded via mylar from the storyarc section, it will have an 'S' in the nzblog
#if it was downloaded outside of mylar and/or not from the storyarc section, it will be a normal issueid in the nzblog
#IssArcID = 'S' + str(ml['IssueArcID'])
myDB.action('DELETE from nzblog WHERE IssueID=? AND SARC=?', ['S' + str(ml['IssueArcID']),ml['StoryArc']])
myDB.action('DELETE from nzblog WHERE IssueID=? AND SARC=?', [ml['IssueArcID'],ml['StoryArc']])
logger.fdebug('%s IssueArcID: %s' % (module, ml['IssueArcID']))
ctrlVal = {"IssueArcID": ml['IssueArcID']} ctrlVal = {"IssueArcID": ml['IssueArcID']}
newVal = {"Status": "Downloaded",
"Location": grab_dst}
logger.fdebug('writing: %s -- %s' % (newVal, ctrlVal)) logger.fdebug('writing: %s -- %s' % (newVal, ctrlVal))
myDB.upsert("storyarcs", newVal, ctrlVal) myDB.upsert("storyarcs", newVal, ctrlVal)
if all([mylar.CONFIG.STORYARCDIR is True, mylar.CONFIG.COPY2ARCDIR is True]):
logger.fdebug('%s [%s] Post-Processing completed for: %s' % (module, ml['StoryArc'], grab_dst)) logger.fdebug('%s [%s] Post-Processing completed for: %s' % (module, ml['StoryArc'], grab_dst))
else:
logger.fdebug('%s [%s] Post-Processing completed for: %s' % (module, ml['StoryArc'], ml['ComicLocation']))
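The one-line if/elif ladder that pads `ReadingOrder` to a three-digit filename prefix is equivalent to `str.zfill`. A minimal sketch (not part of the diff; the function name is illustrative only):

```python
def reading_order_prefix(reading_order):
    # Same result as the branchy padding above: always three digits,
    # so files sort in reading order ('007-...', '042-...', '123-...').
    return str(int(reading_order)).zfill(3)

assert reading_order_prefix(7) == '007'
assert reading_order_prefix('42') == '042'
assert reading_order_prefix(123) == '123'
```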
if (all([self.nzb_name != 'Manual Run', self.apicall is False]) or (self.oneoffinlist is True or all([self.issuearcid is not None, self.issueid is None]))) and not self.nzb_name.startswith('0-Day'): # and all([self.issueid is None, self.comicid is None, self.apicall is False]):
    ppinfo = []
@@ -2744,10 +2772,6 @@ class PostProcessor(object):
prowl = notifiers.PROWL()
prowl.notify(pushmessage, "Download and Postprocessing completed", module=module)
if mylar.CONFIG.NMA_ENABLED:
nma = notifiers.NMA()
nma.notify(prline=prline, prline2=prline2, module=module)
if mylar.CONFIG.PUSHOVER_ENABLED:
    pushover = notifiers.PUSHOVER()
    pushover.notify(prline, prline2, module=module)
@@ -2766,7 +2790,12 @@ class PostProcessor(object):
if mylar.CONFIG.SLACK_ENABLED:
    slack = notifiers.SLACK()
    slack.notify("Download and Postprocessing completed", prline2, module=module)
if mylar.CONFIG.EMAIL_ENABLED and mylar.CONFIG.EMAIL_ONPOST:
logger.info(u"Sending email notification")
email = notifiers.EMAIL()
email.notify(prline2, "Mylar notification - Processed", module=module)
return

View File

@@ -27,7 +27,7 @@ from cgi import escape
import urllib
import re
import mylar
from mylar import logger, encrypted
SESSION_KEY = '_cp_username'
@@ -37,10 +37,18 @@ def check_credentials(username, password):
# Adapt to your needs
forms_user = cherrypy.request.config['auth.forms_username']
forms_pass = cherrypy.request.config['auth.forms_password']
edc = encrypted.Encryptor(forms_pass)
ed_chk = edc.decrypt_it()
if mylar.CONFIG.ENCRYPT_PASSWORDS is True:
if username == forms_user and all([ed_chk['status'] is True, ed_chk['password'] == password]):
return None
else:
return u"Incorrect username or password."
else:
    if username == forms_user and password == forms_pass:
return None
else:
return u"Incorrect username or password."
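The two-path check above can be condensed into a small, self-contained sketch. This is an illustration only: `decrypt` is a hypothetical stand-in for `encrypted.Encryptor(...).decrypt_it()`, and the parameter names are assumptions, not Mylar's API:

```python
def check_creds(username, password, forms_user, stored_pass,
                encrypt_passwords, decrypt=lambda v: v):
    # With ENCRYPT_PASSWORDS on, the stored value is decrypted before the
    # comparison; otherwise it is compared as plaintext, as in the diff above.
    if encrypt_passwords:
        ok = username == forms_user and decrypt(stored_pass) == password
    else:
        ok = username == forms_user and stored_pass == password
    return None if ok else u"Incorrect username or password."

# plaintext path
assert check_creds('admin', 'pw', 'admin', 'pw', False) is None
# "encrypted" path, with a toy decrypt that strips a fake prefix
assert check_creds('admin', 'pw', 'admin', 'ENCpw', True,
                   decrypt=lambda v: v[3:]) is None
```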
def check_auth(*args, **kwargs):
    """A tool that looks in config for 'auth.require'. If found and it

View File

@@ -10,7 +10,7 @@ import threading
import re
import ConfigParser
import mylar
from mylar import logger, helpers, encrypted
config = ConfigParser.SafeConfigParser()
@@ -78,6 +78,7 @@ _CONFIG_DEFINITIONS = OrderedDict({
'FORMAT_BOOKTYPE': (bool, 'General', False),
'CLEANUP_CACHE': (bool, 'General', False),
'SECURE_DIR': (str, 'General', None),
'ENCRYPT_PASSWORDS': (bool, 'General', False),
'RSS_CHECKINTERVAL': (int, 'Scheduler', 20),
'SEARCH_INTERVAL': (int, 'Scheduler', 360),
@@ -154,11 +155,6 @@ _CONFIG_DEFINITIONS = OrderedDict({
'PROWL_KEYS': (str, 'Prowl', None),
'PROWL_ONSNATCH': (bool, 'Prowl', False),
'NMA_ENABLED': (bool, 'NMA', False),
'NMA_APIKEY': (str, 'NMA', None),
'NMA_PRIORITY': (int, 'NMA', 0),
'NMA_ONSNATCH': (bool, 'NMA', False),
'PUSHOVER_ENABLED': (bool, 'PUSHOVER', False),
'PUSHOVER_PRIORITY': (int, 'PUSHOVER', 0),
'PUSHOVER_APIKEY': (str, 'PUSHOVER', None),
@@ -185,6 +181,17 @@ _CONFIG_DEFINITIONS = OrderedDict({
'SLACK_WEBHOOK_URL': (str, 'SLACK', None),
'SLACK_ONSNATCH': (bool, 'SLACK', False),
'EMAIL_ENABLED': (bool, 'Email', False),
'EMAIL_FROM': (str, 'Email', ''),
'EMAIL_TO': (str, 'Email', ''),
'EMAIL_SERVER': (str, 'Email', ''),
'EMAIL_USER': (str, 'Email', ''),
'EMAIL_PASSWORD': (str, 'Email', ''),
'EMAIL_PORT': (int, 'Email', 25),
'EMAIL_ENC': (int, 'Email', 0),
'EMAIL_ONGRAB': (bool, 'Email', True),
'EMAIL_ONPOST': (bool, 'Email', True),
'POST_PROCESSING': (bool, 'PostProcess', False),
'FILE_OPTS': (str, 'PostProcess', 'move'),
'SNATCHEDTORRENT_NOTIFY': (bool, 'PostProcess', False),
@@ -385,7 +392,7 @@ class Config(object):
    count = sum(1 for line in open(self._config_file))
else:
    count = 0
self.newconfig = 10
if count == 0:
    CONFIG_VERSION = 0
    MINIMALINI = False
@@ -505,9 +512,11 @@ class Config(object):
    shutil.move(self._config_file, os.path.join(mylar.DATA_DIR, 'config.ini.backup'))
except:
    print('Unable to make proper backup of config file in %s' % os.path.join(mylar.DATA_DIR, 'config.ini.backup'))
if self.CONFIG_VERSION < 10:
    print('Attempting to update configuration..')
    #8-torznab multiple entries merged into extra_torznabs value
    #9-remote rtorrent ssl option
    #10-encryption of all keys/passwords.
    self.config_update()
setattr(self, 'CONFIG_VERSION', str(self.newconfig))
config.set('General', 'CONFIG_VERSION', str(self.newconfig))
@@ -555,7 +564,7 @@ class Config(object):
config.remove_option('Torznab', 'torznab_category')
config.remove_option('Torznab', 'torznab_verify')
print('Successfully removed outdated config entries.')
if self.newconfig < 9:
    #rejig rtorrent settings due to change.
    try:
        if all([self.RTORRENT_SSL is True, not self.RTORRENT_HOST.startswith('http')]):
@@ -565,6 +574,15 @@ class Config(object):
        pass
    config.remove_option('Rtorrent', 'rtorrent_ssl')
    print('Successfully removed outdated config entries.')
if self.newconfig < 10:
#encrypt all passwords / apikeys / usernames in ini file.
#leave non-ini items (ie. memory) as un-encrypted items.
try:
if self.ENCRYPT_PASSWORDS is True:
self.encrypt_items(mode='encrypt', updateconfig=True)
except Exception as e:
print('Error: %s' % e)
print('Successfully updated config to version 10 ( password / apikey - .ini encryption )')
print('Configuration upgraded to version %s' % self.newconfig)
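The version-gated migration blocks above (`if self.newconfig < 9: ...`, `if self.newconfig < 10: ...`) follow a standard pattern: each step runs once, in order, for any config older than its target version. A minimal sketch of that pattern, with made-up step names (not Mylar code):

```python
def migrate(config_version):
    # Sequential, version-gated migrations: a config at version 8 gets both
    # steps, a config at version 9 gets only the v10 step, and so on.
    applied = []
    if config_version < 9:
        applied.append('rtorrent-ssl-rework')    # v9: fold rtorrent_ssl into the host URL
    if config_version < 10:
        applied.append('encrypt-ini-secrets')    # v10: obfuscate passwords/API keys in the ini
    return applied

assert migrate(8) == ['rtorrent-ssl-rework', 'encrypt-ini-secrets']
assert migrate(9) == ['encrypt-ini-secrets']
assert migrate(10) == []
```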
def check_section(self, section, key):
@@ -713,6 +731,10 @@ class Config(object):
else:
    pass
if self.ENCRYPT_PASSWORDS is True:
self.encrypt_items(mode='encrypt')
def writeconfig(self, values=None):
    logger.fdebug("Writing configuration to file")
    self.provider_sequence()
@@ -741,6 +763,74 @@ class Config(object):
except IOError as e:
    logger.warn("Error writing configuration file: %s", e)
def encrypt_items(self, mode='encrypt', updateconfig=False):
encryption_list = OrderedDict({
#key section key value
'HTTP_PASSWORD': ('Interface', 'http_password', self.HTTP_PASSWORD),
'SAB_PASSWORD': ('SABnzbd', 'sab_password', self.SAB_PASSWORD),
'SAB_APIKEY': ('SABnzbd', 'sab_apikey', self.SAB_APIKEY),
'NZBGET_PASSWORD': ('NZBGet', 'nzbget_password', self.NZBGET_PASSWORD),
'NZBSU_APIKEY': ('NZBsu', 'nzbsu_apikey', self.NZBSU_APIKEY),
'DOGNZB_APIKEY': ('DOGnzb', 'dognzb_apikey', self.DOGNZB_APIKEY),
'UTORRENT_PASSWORD': ('uTorrent', 'utorrent_password', self.UTORRENT_PASSWORD),
'TRANSMISSION_PASSWORD': ('Transmission', 'transmission_password', self.TRANSMISSION_PASSWORD),
'DELUGE_PASSWORD': ('Deluge', 'deluge_password', self.DELUGE_PASSWORD),
'QBITTORRENT_PASSWORD': ('qBittorrent', 'qbittorrent_password', self.QBITTORRENT_PASSWORD),
'RTORRENT_PASSWORD': ('Rtorrent', 'rtorrent_password', self.RTORRENT_PASSWORD),
'PROWL_KEYS': ('Prowl', 'prowl_keys', self.PROWL_KEYS),
'PUSHOVER_APIKEY': ('PUSHOVER', 'pushover_apikey', self.PUSHOVER_APIKEY),
'PUSHOVER_USERKEY': ('PUSHOVER', 'pushover_userkey', self.PUSHOVER_USERKEY),
'BOXCAR_TOKEN': ('BOXCAR', 'boxcar_token', self.BOXCAR_TOKEN),
'PUSHBULLET_APIKEY': ('PUSHBULLET', 'pushbullet_apikey', self.PUSHBULLET_APIKEY),
'TELEGRAM_TOKEN': ('TELEGRAM', 'telegram_token', self.TELEGRAM_TOKEN),
'COMICVINE_API': ('CV', 'comicvine_api', self.COMICVINE_API),
'PASSWORD_32P': ('32P', 'password_32p', self.PASSWORD_32P),
'PASSKEY_32P': ('32P', 'passkey_32p', self.PASSKEY_32P),
'USERNAME_32P': ('32P', 'username_32p', self.USERNAME_32P),
'SEEDBOX_PASS': ('Seedbox', 'seedbox_pass', self.SEEDBOX_PASS),
'TAB_PASS': ('Tablet', 'tab_pass', self.TAB_PASS),
'API_KEY': ('API', 'api_key', self.API_KEY),
'OPDS_PASSWORD': ('OPDS', 'opds_password', self.OPDS_PASSWORD),
'PP_SSHPASSWD': ('AutoSnatch', 'pp_sshpasswd', self.PP_SSHPASSWD),
})
new_encrypted = 0
for k,v in encryption_list.iteritems():
value = []
for x in v:
value.append(x)
if value[2] is not None:
if value[2][:5] == '^~$z$':
if mode == 'decrypt':
hp = encrypted.Encryptor(value[2])
decrypted_password = hp.decrypt_it()
if decrypted_password['status'] is False:
logger.warn('Password unable to decrypt - you might have to manually edit the ini for %s to reset the value' % value[1])
else:
if k != 'HTTP_PASSWORD':
setattr(self, k, decrypted_password['password'])
config.set(value[0], value[1], decrypted_password['password'])
else:
if k == 'HTTP_PASSWORD':
hp = encrypted.Encryptor(value[2])
decrypted_password = hp.decrypt_it()
if decrypted_password['status'] is False:
logger.warn('Password unable to decrypt - you might have to manually edit the ini for %s to reset the value' % value[1])
else:
setattr(self, k, decrypted_password['password'])
else:
hp = encrypted.Encryptor(value[2])
encrypted_password = hp.encrypt_it()
if encrypted_password['status'] is False:
logger.warn('Unable to encrypt password for %s - it has not been encrypted. Keeping it as it is.' % value[1])
else:
if k == 'HTTP_PASSWORD':
#make sure we set the http_password for signon to the encrypted value otherwise won't match
setattr(self, k, encrypted_password['password'])
config.set(value[0], value[1], encrypted_password['password'])
new_encrypted+=1
def configure(self, update=False, startup=False):
    #force alt_pull = 2 on restarts regardless of settings
@@ -880,6 +970,9 @@ class Config(object):
elif all([self.HTTP_USERNAME is None, self.HTTP_PASSWORD is None]):
    self.AUTHENTICATION = 0
if self.ENCRYPT_PASSWORDS is True:
self.encrypt_items(mode='decrypt')
if all([self.IGNORE_TOTAL is True, self.IGNORE_HAVETOTAL is True]):
    self.IGNORE_TOTAL = False
    self.IGNORE_HAVETOTAL = False

View File

@@ -24,6 +24,7 @@ import lib.feedparser
import mylar
import platform
from bs4 import BeautifulSoup as Soup
from xml.parsers.expat import ExpatError
import httplib
import requests
@@ -96,10 +97,19 @@ def pulldetails(comicid, type, issueid=None, offset=1, arclist=None, comicidlist=None):
    return
#logger.fdebug('cv status code : ' + str(r.status_code))
try:
    dom = parseString(r.content)
except ExpatError:
if u'<title>Abnormal Traffic Detected' in r.content:
logger.error('ComicVine has banned this server\'s IP address because it exceeded the API rate limit.')
else:
logger.warn('[WARNING] ComicVine is not responding correctly at the moment. This is usually due to some problems on their end. If you re-try things again in a few moments, things might work')
return
except Exception as e:
logger.warn('[ERROR] Error returned from CV: %s' % e)
return
else:
return dom
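The guard added above exists because ComicVine returns an HTML error page (e.g. "Abnormal Traffic Detected") instead of XML when rate-limited, and `parseString` then raises `ExpatError`. A minimal, self-contained sketch of the same pattern (the function name is illustrative, not Mylar's API):

```python
from xml.dom.minidom import parseString
from xml.parsers.expat import ExpatError

def parse_or_none(content):
    # Malformed responses (HTML error pages, truncated bodies) raise
    # ExpatError rather than returning a DOM; treat them as "no result".
    try:
        return parseString(content)
    except ExpatError:
        return None

assert parse_or_none('<response><results/></response>') is not None
assert parse_or_none('<html>Abnormal Traffic Detected') is None
```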
def getComic(comicid, type, issueid=None, arc=None, arcid=None, arclist=None, comicidlist=None):
    if type == 'issue':
@@ -413,16 +423,16 @@ def GetComicInfo(comicid, dom, safechk=None):
        issuerun = issuerun[:srchline+len(x)]
        break
    except Exception as e:
        #logger.warn('[ERROR] %s' % e)
        continue
else:
    iss_start = fc_name.find('#')
    issuerun = fc_name[iss_start:].strip()
    fc_name = fc_name[:iss_start].strip()
if issuerun.strip().endswith('.') or issuerun.strip().endswith(','):
    #logger.fdebug('Changed issuerun from %s to %s' % (issuerun, issuerun[:-1]))
    issuerun = issuerun.strip()[:-1]
if issuerun.endswith(' and '):
    issuerun = issuerun[:-4].strip()
elif issuerun.endswith(' and'):

mylar/encrypted.py (new file, 55 lines)

@@ -0,0 +1,55 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# This file is part of Mylar.
#
# Mylar is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Mylar is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Mylar. If not, see <http://www.gnu.org/licenses/>.
import random
import base64
import re
import sys
import os
import mylar
from mylar import logger
class Encryptor(object):
def __init__(self, password, chk_password=None):
self.password = password.encode('utf-8')
def encrypt_it(self):
try:
salt = os.urandom(8)
saltedhash = [salt[i] for i in range (0, len(salt))]
salted_pass = base64.b64encode('%s%s' % (self.password,salt))
except Exception as e:
logger.warn('Error when encrypting: %s' % e)
return {'status': False}
else:
return {'status': True, 'password': '^~$z$' + salted_pass}
def decrypt_it(self):
try:
if not self.password.startswith('^~$z$'):
logger.warn('Error: not an encrypted value that I recognize.')
return {'status': False}
passd = base64.b64decode(self.password[5:]) #(base64.decodestring(self.password))
saltedhash = [bytes(passd[-8:])]
except Exception as e:
logger.warn('Error when decrypting password: %s' % e)
return {'status': False}
else:
return {'status': True, 'password': passd[:-8]}
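The `Encryptor` class above is reversible obfuscation rather than real encryption: it appends an 8-byte random salt, base64-encodes the result, and marks it with the `^~$z$` prefix, so anyone with read access to the ini can recover the value. A minimal Python 3 sketch of the same round-trip (function names are illustrative, not Mylar's API):

```python
import base64
import os

PREFIX = '^~$z$'  # marker used to detect already-obfuscated ini values

def obfuscate(password):
    # Append an 8-byte random salt, then base64 the whole thing.
    salt = os.urandom(8)
    return PREFIX + base64.b64encode(password.encode('utf-8') + salt).decode('ascii')

def deobfuscate(token):
    if not token.startswith(PREFIX):
        raise ValueError('not an obfuscated value')
    raw = base64.b64decode(token[len(PREFIX):])
    return raw[:-8].decode('utf-8')  # strip the trailing 8-byte salt

token = obfuscate('hunter2')
assert token.startswith(PREFIX)
assert deobfuscate(token) == 'hunter2'
```

The prefix check is what lets `encrypt_items` skip values that are already obfuscated and decide between the encrypt and decrypt paths.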

View File

@@ -99,7 +99,7 @@ class FileChecker(object):
self.pp_mode = False
self.failed_files = []
self.dynamic_handlers = ['/','-',':',';','\'',',','&','?','!','+','(',')','\u2014','\u2013']
self.dynamic_replacements = ['and','the']
self.rippers = ['-empire','-empire-hd','minutemen-','-dcp']
@@ -329,7 +329,8 @@ class FileChecker(object):
ret_sf2 = ' '.join(split_file3)
sf = re.findall('''\( [^\)]* \) |\[ [^\]]* \] |\[ [^\#]* \]|\S+''', ret_sf2, re.VERBOSE)
#sf = re.findall('''\( [^\)]* \) |\[ [^\]]* \] |\S+''', ret_sf2, re.VERBOSE)
ret_sf1 = ' '.join(sf)
@@ -342,9 +343,8 @@ class FileChecker(object):
ret_sf1 = re.sub('\&', 'f11', ret_sf1).strip()
ret_sf1 = re.sub('\'', 'g11', ret_sf1).strip()
#split_file = re.findall('(?imu)\([\w\s-]+\)|[-+]?\d*\.\d+|\d+|[\w-]+|#?\d\.\d+|#(?<![\w\d])XCV(?![\w\d])+|\)', ret_sf1, re.UNICODE)
split_file = re.findall('(?imu)\([\w\s-]+\)|[-+]?\d*\.\d+|\d+[\s]COVERS+|\d{4}-\d{2}-\d{2}|\d+[(th|nd|rd|st)]+|\d+|[\w-]+|#?\d\.\d+|#[\.-]\w+|#[\d*\.\d+|\w+\d+]+|#(?<![\w\d])XCV(?![\w\d])+|#[\w+]|\)', ret_sf1, re.UNICODE)
#10-20-2018 ---START -- attempt to detect '01 (of 7.3)'
#10-20-2018 -- attempt to detect '36p ctc' as one element
spf = []
@@ -358,6 +358,8 @@ class FileChecker(object):
mini = False
try:
    logger.fdebug('checking now: %s' % x)
if x.lower() == 'infinity':
raise Exception
if x.isdigit():
    logger.fdebug('[MINI-SERIES] MAX ISSUES IN SERIES: %s' % x)
    spf.append('(of %s)' % x)
@@ -505,6 +507,12 @@ class FileChecker(object):
logger.fdebug('Issue Number SHOULD BE: ' + str(lastissue_label))
validcountchk = True
match2 = re.search('(\d+[\s])covers', sf, re.IGNORECASE)
if match2:
num_covers = re.sub('[^0-9]', '', match2.group()).strip()
#logger.fdebug('%s covers detected within filename' % num_covers)
continue
if all([lastissue_position == (split_file.index(sf) -1), lastissue_label is not None, '#' not in sf, sf != 'p']):
    #find it in the original file to see if there's a decimal between.
    findst = lastissue_mod_position+1
@@ -594,6 +602,16 @@ class FileChecker(object):
try:
    volume_found['position'] = split_file.index(volumeprior_label, current_pos -1) #if this passes, then we're ok, otherwise will try exception
    logger.fdebug('volume_found: ' + str(volume_found['position']))
#remove volume numeric from split_file
split_file.pop(volume_found['position'])
split_file.pop(split_file.index(sf, current_pos-1))
#join the previous label to the volume numeric
#volume = str(volumeprior_label) + str(volume)
#insert the combined info back
split_file.insert(volume_found['position'], volumeprior_label + volume)
split_file.insert(volume_found['position']+1, '')
#volume_found['position'] = split_file.index(sf, current_pos)
#logger.fdebug('NEWSPLITFILE: %s' % split_file)
except: except:
volumeprior = False volumeprior = False
volumeprior_label = None volumeprior_label = None
@@ -603,10 +621,10 @@ class FileChecker(object):
 volume_found['position'] = split_file.index(sf, current_pos)
 volume_found['volume'] = volume
-logger.fdebug('volume label detected as : Volume ' + str(volume) + ' @ position: ' + str(split_file.index(sf)))
+logger.fdebug('volume label detected as : Volume %s @ position: %s' % (volume, volume_found['position']))
 volumeprior = False
 volumeprior_label = None
-elif 'vol' in sf.lower() and len(sf) == 3:
+elif all(['vol' in sf.lower(), len(sf) == 3]) or all(['vol.' in sf.lower(), len(sf) == 4]):
     #if there's a space between the vol and # - adjust.
     volumeprior = True
     volumeprior_label = sf
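The widened `elif` now accepts both the bare `vol` token and the dotted `vol.` token. A quick sketch of which tokens trip the volume-prior branch (the helper name and sample tokens are hypothetical):

```python
def is_volume_prefix(sf):
    # mirrors the diff: exact 'vol' (length 3) or 'vol.' (length 4), case-insensitive
    return (('vol' in sf.lower() and len(sf) == 3) or
            ('vol.' in sf.lower() and len(sf) == 4))

for token in ('vol', 'Vol.', 'volume', 'v2'):
    print(token, is_volume_prefix(token))
```

`volume` and `v2` fall through to the other numbering branches; only the standalone prefix forms set `volumeprior`.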
@@ -772,14 +790,15 @@ class FileChecker(object):
 yearposition = possible_years[0]['yearposition']
 yearmodposition = possible_years[0]['yearmodposition']
 else:
-    for x in possible_years:
-        logger.info('yearposition[%s] -- dc[position][%s]' % (yearposition, x['yearposition']))
-        if yearposition < x['yearposition']:
-            if all([len(possible_issuenumbers) == 1, possible_issuenumbers[0]['number'] == x['year'], x['yearposition'] != possible_issuenumbers[0]['position']]):
-                issue2year = True
-                highest_series_pos = x['yearposition']
-            yearposition = x['yearposition']
-            yearmodposition = x['yearmodposition']
+    if len(possible_issuenumbers) > 0:
+        for x in possible_years:
+            logger.info('yearposition[%s] -- dc[position][%s]' % (yearposition, x['yearposition']))
+            if yearposition < x['yearposition']:
+                if all([len(possible_issuenumbers) == 1, possible_issuenumbers[0]['number'] == x['year'], x['yearposition'] != possible_issuenumbers[0]['position']]):
+                    issue2year = True
+                    highest_series_pos = x['yearposition']
+                yearposition = x['yearposition']
+                yearmodposition = x['yearmodposition']
 if highest_series_pos > yearposition: highest_series_pos = yearposition #dc['position']: highest_series_pos = dc['position']
 else:
@@ -814,7 +833,7 @@ class FileChecker(object):
 logger.fdebug('Numeric detected as the last digit after a hyphen. Typically this is the issue number.')
 if pis['position'] != yearposition:
     issue_number = pis['number']
-    logger.info('Issue set to: ' + str(issue_number))
+    #logger.info('Issue set to: ' + str(issue_number))
     issue_number_position = pis['position']
     if highest_series_pos > pis['position']: highest_series_pos = pis['position']
     #break
@@ -916,10 +935,11 @@ class FileChecker(object):
 if split_file[issue_number_position -1].lower() == 'annual' or split_file[issue_number_position -1].lower() == 'special':
     highest_series_pos = issue_number_position
 else:
-    if volume_found['position'] < issue_number_position:
-        highest_series_pos = issue_number_position - 1
-    else:
-        highest_series_pos = issue_number_position
+    highest_series_pos = issue_number_position - 1
+    #if volume_found['position'] < issue_number_position:
+    #    highest_series_pos = issue_number_position - 1
+    #else:
+    #    highest_series_pos = issue_number_position

 #make sure if we have multiple years detected, that the right one gets picked for the actual year vs. series title
 if len(possible_years) > 1:
@@ -1016,6 +1036,12 @@ class FileChecker(object):
 #c1 = '+'
 #series_name = ' '.join(split_file[:highest_series_pos])
 if yearposition != 0:
+    if yearposition is not None and yearposition < highest_series_pos:
+        if yearposition+1 == highest_series_pos:
+            highest_series_pos = yearposition
+        else:
+            if split_file[yearposition+1] == '-' and yearposition+2 == highest_series_pos:
+                highest_series_pos = yearposition
     series_name = ' '.join(split_file[:highest_series_pos])
 else:
     if highest_series_pos <= issue_number_position and all([len(split_file[0]) == 4, split_file[0].isdigit()]):


@@ -1610,7 +1610,9 @@ def image_it(comicid, latestissueid, comlocation, ComicImage):
     if mylar.CONFIG.ENFORCE_PERMS:
         filechecker.setperms(comiclocal)
 except IOError as e:
-    logger.error('Unable to save cover into series directory (%s) at this time' % comiclocal)
+    logger.error('[%s] Error saving cover into series directory (%s) at this time' % (e, comiclocal))
+except Exception as e:
+    logger.error('[%s] Unable to save cover into series directory (%s) at this time' % (e, comiclocal))
 myDB = db.DBConnection()
 myDB.upsert('comics', {'ComicImage': ComicImage}, {'ComicID': comicid})


@@ -21,6 +21,7 @@ import threading
 import platform
 import urllib, urllib2
 from xml.dom.minidom import parseString, Element
+from xml.parsers.expat import ExpatError
 import requests
 import mylar
@@ -70,8 +71,8 @@ def pullsearch(comicapi, comicquery, offset, type):
 try:
     r = requests.get(PULLURL, params=payload, verify=mylar.CONFIG.CV_VERIFY, headers=mylar.CV_HEADERS)
-except Exception, e:
-    logger.warn('Error fetching data from ComicVine: %s' % (e))
+except Exception as e:
+    logger.warn('Error fetching data from ComicVine: %s' % e)
     return
 try:
@@ -82,8 +83,11 @@ def pullsearch(comicapi, comicquery, offset, type):
     else:
         logger.warn('[WARNING] ComicVine is not responding correctly at the moment. This is usually due to some problems on their end. If you re-try things again in a few moments, it might work properly.')
         return
-    return dom
+except Exception as e:
+    logger.warn('[ERROR] Error returned from CV: %s' % e)
+    return
+else:
+    return dom

 def findComic(name, mode, issue, limityear=None, type=None):
@@ -459,24 +463,20 @@ def storyarcinfo(xmlid):
 try:
     r = requests.get(ARCPULL_URL, params=payload, verify=mylar.CONFIG.CV_VERIFY, headers=mylar.CV_HEADERS)
-except Exception, e:
-    logger.warn('Error fetching data from ComicVine: %s' % (e))
+except Exception as e:
+    logger.warn('While parsing data from ComicVine, got exception: %s' % e)
     return

-#        try:
-#            file = urllib2.urlopen(ARCPULL_URL)
-#        except urllib2.HTTPError, err:
-#            logger.error('err : ' + str(err))
-#            logger.error('There was a major problem retrieving data from ComicVine - on their end.')
-#            return
-#        arcdata = file.read()
-#        file.close()
 try:
-    arcdom = parseString(r.content) #(arcdata)
+    arcdom = parseString(r.content)
 except ExpatError:
     if u'<title>Abnormal Traffic Detected' in r.content:
-        logger.error("ComicVine has banned this server's IP address because it exceeded the API rate limit.")
+        logger.error('ComicVine has banned this server\'s IP address because it exceeded the API rate limit.')
     else:
-        logger.warn('While parsing data from ComicVine, got exception: %s for data: %s' % (str(e), r.content))
+        logger.warn('While parsing data from ComicVine, got exception: %s for data: %s' % (e, r.content))
+    return
+except Exception as e:
+    logger.warn('While parsing data from ComicVine, got exception: %s for data: %s' % (e, r.content))
     return
 try:
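The new `ExpatError` import pays off here: ComicVine's ban page is HTML, so `parseString` raises instead of returning a DOM. A minimal, self-contained sketch of the same guard (the function name and sample payloads are made up for illustration):

```python
from xml.dom.minidom import parseString
from xml.parsers.expat import ExpatError

def parse_cv_response(content):
    """Return a DOM for well-formed XML, None for HTML error pages or junk."""
    try:
        return parseString(content)
    except ExpatError:
        # e.g. ComicVine's '<title>Abnormal Traffic Detected' ban page
        return None

print(parse_cv_response('<response><results/></response>') is not None)  # True
print(parse_cv_response('<html><title>Abnormal Traffic Detected') is None)  # True
```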


@@ -27,6 +27,9 @@ import time
 import simplejson
 import json
 import requests
+import smtplib
+from email.mime.multipart import MIMEMultipart
+from email.mime.text import MIMEText

 # This was obviously all taken from headphones with great appreciation :)
@@ -80,127 +83,6 @@ class PROWL:
     def test_notify(self):
         self.notify('ZOMG Lazors Pewpewpew!', 'Test Message')
class NMA:
def __init__(self, test_apikey=None):
self.NMA_URL = "https://www.notifymyandroid.com/publicapi/notify"
self.TEST_NMA_URL = "https://www.notifymyandroid.com/publicapi/verify"
if test_apikey is None:
self.apikey = mylar.CONFIG.NMA_APIKEY
self.test = False
else:
self.apikey = test_apikey
self.test = True
self.priority = mylar.CONFIG.NMA_PRIORITY
self._session = requests.Session()
def _send(self, data, module):
try:
r = self._session.post(self.NMA_URL, data=data, verify=True)
except requests.exceptions.RequestException as e:
logger.error(module + '[' + str(e) + '] Unable to send via NMA. Aborting notification for this item.')
return {'status': False,
'message': str(e)}
logger.fdebug('[NMA] Status code returned: ' + str(r.status_code))
if r.status_code == 200:
from xml.dom.minidom import parseString
dom = parseString(r.content)
try:
success_info = dom.getElementsByTagName('success')
success_code = success_info[0].getAttribute('code')
except:
error_info = dom.getElementsByTagName('error')
error_code = error_info[0].getAttribute('code')
error_message = error_info[0].childNodes[0].nodeValue
logger.info(module + '[' + str(error_code) + '] ' + error_message)
return {'status': False,
'message': '[' + str(error_code) + '] ' + error_message}
else:
if self.test is True:
logger.info(module + '[' + str(success_code) + '] NotifyMyAndroid apikey valid. Test notification sent successfully.')
else:
logger.info(module + '[' + str(success_code) + '] NotifyMyAndroid notification sent successfully.')
return {'status': True,
'message': 'APIKEY verified OK / notification sent'}
elif r.status_code >= 400 and r.status_code < 500:
logger.error(module + ' NotifyMyAndroid request failed: %s' % r.content)
return {'status': False,
'message': 'APIKEY verified OK / failure to send notification'}
else:
logger.error(module + ' NotifyMyAndroid notification failed serverside.')
return {'status': False,
'message': 'APIKEY verified OK / failure to send notification'}
def notify(self, snline=None, prline=None, prline2=None, snatched_nzb=None, sent_to=None, prov=None, module=None):
if module is None:
module = ''
module += '[NOTIFIER]'
apikey = self.apikey
priority = self.priority
if snatched_nzb:
if snatched_nzb[-1] == '\.': snatched_nzb = snatched_nzb[:-1]
event = snline
description = "Mylar has snatched: " + snatched_nzb + " from " + prov + " and " + sent_to
else:
event = prline
description = prline2
data = {'apikey': apikey, 'application': 'Mylar', 'event': event.encode('utf-8'), 'description': description.encode('utf-8'), 'priority': priority}
logger.info(module + ' Sending notification request to NotifyMyAndroid')
request = self._send(data, module)
if not request:
logger.warn(module + ' Error sending notification request to NotifyMyAndroid')
def test_notify(self):
module = '[TEST-NOTIFIER]'
try:
r = self._session.get(self.TEST_NMA_URL, params={'apikey': self.apikey}, verify=True)
except requests.exceptions.RequestException as e:
logger.error(module + '[' + str(e) + '] Unable to send via NMA. Aborting test notification - something is probably wrong...')
return {'status': False,
'message': str(e)}
logger.fdebug('[NMA] Status code returned: ' + str(r.status_code))
if r.status_code == 200:
from xml.dom.minidom import parseString
dom = parseString(r.content)
try:
success_info = dom.getElementsByTagName('success')
success_code = success_info[0].getAttribute('code')
except:
error_info = dom.getElementsByTagName('error')
error_code = error_info[0].getAttribute('code')
error_message = error_info[0].childNodes[0].nodeValue
logger.info(module + '[' + str(error_code) + '] ' + error_message)
return {'status': False,
'message': '[' + str(error_code) + '] ' + error_message}
else:
logger.info(module + '[' + str(success_code) + '] NotifyMyAndroid apikey valid. Testing notification service with it.')
elif r.status_code >= 400 and r.status_code < 500:
logger.error(module + ' NotifyMyAndroid request failed: %s' % r.content)
return {'status': False,
'message': 'Unable to send request to NMA - check your connection.'}
else:
logger.error(module + ' NotifyMyAndroid notification failed serverside.')
return {'status': False,
'message': 'Internal Server Error. Try again later.'}
event = 'Test Message'
description = 'ZOMG Lazors PewPewPew!'
data = {'apikey': self.apikey, 'application': 'Mylar', 'event': event.encode('utf-8'), 'description': description.encode('utf-8'), 'priority': 2}
return self._send(data,'[NOTIFIER]')
 # 2013-04-01 Added Pushover.net notifications, based on copy of Prowl class above.
 # No extra care has been put into API friendliness at the moment (read: https://pushover.net/api#friendly)
 class PUSHOVER:
@@ -479,6 +361,53 @@ class TELEGRAM:
     def test_notify(self):
         return self.notify('Test Message: Release the Ninjas!')
class EMAIL:
def __init__(self, test_emailfrom=None, test_emailto=None, test_emailsvr=None, test_emailport=None, test_emailuser=None, test_emailpass=None, test_emailenc=None):
self.emailfrom = mylar.CONFIG.EMAIL_FROM if test_emailfrom is None else test_emailfrom
self.emailto = mylar.CONFIG.EMAIL_TO if test_emailto is None else test_emailto
self.emailsvr = mylar.CONFIG.EMAIL_SERVER if test_emailsvr is None else test_emailsvr
self.emailport = mylar.CONFIG.EMAIL_PORT if test_emailport is None else test_emailport
self.emailuser = mylar.CONFIG.EMAIL_USER if test_emailuser is None else test_emailuser
self.emailpass = mylar.CONFIG.EMAIL_PASSWORD if test_emailpass is None else test_emailpass
self.emailenc = mylar.CONFIG.EMAIL_ENC if test_emailenc is None else int(test_emailenc)
def notify(self, message, subject, module=None):
if module is None:
module = ''
module += '[NOTIFIER]'
sent_successfully = False
try:
logger.debug(module + u' Sending email notification. From: [%s] - To: [%s] - Server: [%s] - Port: [%s] - Username: [%s] - Password: [********] - Encryption: [%s] - Message: [%s]' % (self.emailfrom, self.emailto, self.emailsvr, self.emailport, self.emailuser, self.emailenc, message))
msg = MIMEMultipart()
msg['From'] = str(self.emailfrom)
msg['To'] = str(self.emailto)
msg['Subject'] = subject
msg.attach(MIMEText(message, 'plain'))
if self.emailenc is 1:
sock = smtplib.SMTP_SSL(self.emailsvr, str(self.emailport))
else:
sock = smtplib.SMTP(self.emailsvr, str(self.emailport))
if self.emailenc is 2:
sock.starttls()
if self.emailuser or self.emailpass:
sock.login(str(self.emailuser), str(self.emailpass))
sock.sendmail(str(self.emailfrom), str(self.emailto), msg.as_string())
sock.quit()
sent_successfully = True
except Exception, e:
logger.warn(module + u' Oh no!! Email notification failed: ' + str(e))
return sent_successfully
def test_notify(self):
return self.notify('Test Message: With great power comes great responsibility.', 'Mylar notification - Test')
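The message-assembly half of the new `EMAIL` notifier can be exercised without an SMTP server (the helper name and addresses below are placeholders); the sending half then picks `smtplib.SMTP_SSL` when the encryption setting is 1 and upgrades a plain `smtplib.SMTP` session via `starttls()` when it is 2:

```python
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

def build_notification(emailfrom, emailto, subject, message):
    # same assembly as EMAIL.notify(), minus the smtplib send
    msg = MIMEMultipart()
    msg['From'] = emailfrom
    msg['To'] = emailto
    msg['Subject'] = subject
    msg.attach(MIMEText(message, 'plain'))
    return msg

msg = build_notification('mylar@example.com', 'admin@example.com',
                         'Mylar notification - Test',
                         'Test Message: With great power comes great responsibility.')
print(msg['Subject'])
```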
 class SLACK:
     def __init__(self, test_webhook_url=None):
         self.webhook_url = mylar.CONFIG.SLACK_WEBHOOK_URL if test_webhook_url is None else test_webhook_url
@@ -488,10 +417,13 @@ class SLACK:
             module = ''
         module += '[NOTIFIER]'

-        if all([sent_to is not None, prov is not None]):
-            attachment_text += ' from %s and %s' % (prov, sent_to)
-        elif sent_to is None:
-            attachment_text += ' from %s' % prov
+        if 'snatched' in attachment_text.lower():
+            snatched_text = '%s: %s' % (attachment_text, snatched_nzb)
+            if all([sent_to is not None, prov is not None]):
+                snatched_text += ' from %s and %s' % (prov, sent_to)
+            elif sent_to is None:
+                snatched_text += ' from %s' % prov
+            attachment_text = snatched_text
         else:
             pass
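The reworked branch only decorates snatch notifications; every other event passes through untouched. A standalone sketch of that logic (the function name is illustrative, not part of Mylar):

```python
def decorate_attachment(attachment_text, snatched_nzb=None, prov=None, sent_to=None):
    # mirrors SLACK.notify: append provider details only when the event is a snatch
    if 'snatched' in attachment_text.lower():
        snatched_text = '%s: %s' % (attachment_text, snatched_nzb)
        if sent_to is not None and prov is not None:
            snatched_text += ' from %s and %s' % (prov, sent_to)
        elif sent_to is None:
            snatched_text += ' from %s' % prov
        attachment_text = snatched_text
    return attachment_text

print(decorate_attachment('Snatched', 'Batman #55', prov='DDL'))  # Snatched: Batman #55 from DDL
print(decorate_attachment('Test Message'))                        # Test Message
```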


@@ -1102,7 +1102,7 @@ def torsend2client(seriesname, issue, seriesyear, linkit, site, pubhash=None):
     scraper = cfscrape.create_scraper()
     if site == 'WWT':
         if mylar.WWT_CF_COOKIEVALUE is None:
-            cf_cookievalue, cf_user_agent = scraper.get_tokens(newurl, user_agent=mylar.CV_HEADERS['User-Agent'])
+            cf_cookievalue, cf_user_agent = scraper.get_tokens(url, user_agent=mylar.CV_HEADERS['User-Agent'])
             mylar.WWT_CF_COOKIEVALUE = cf_cookievalue
         r = scraper.get(url, params=payload, cookies=mylar.WWT_CF_COOKIEVALUE, verify=verify, stream=True, headers=headers)
     else:


@@ -2729,10 +2729,6 @@ def notify_snatch(sent_to, comicname, comyear, IssueNumber, nzbprov, pack):
         logger.info(u"Sending Prowl notification")
         prowl = notifiers.PROWL()
         prowl.notify(snatched_name, 'Download started using %s' % sent_to)
-    if mylar.CONFIG.NMA_ENABLED and mylar.CONFIG.NMA_ONSNATCH:
-        logger.info(u"Sending NMA notification")
-        nma = notifiers.NMA()
-        nma.notify(snline=snline, snatched_nzb=snatched_name, sent_to=sent_to, prov=nzbprov)
     if mylar.CONFIG.PUSHOVER_ENABLED and mylar.CONFIG.PUSHOVER_ONSNATCH:
         logger.info(u"Sending Pushover notification")
         pushover = notifiers.PUSHOVER()
@@ -2753,6 +2749,10 @@ def notify_snatch(sent_to, comicname, comyear, IssueNumber, nzbprov, pack):
         logger.info(u"Sending Slack notification")
         slack = notifiers.SLACK()
         slack.notify("Snatched", snline, snatched_nzb=snatched_name, sent_to=sent_to, prov=nzbprov)
+    if mylar.CONFIG.EMAIL_ENABLED and mylar.CONFIG.EMAIL_ONGRAB:
+        logger.info(u"Sending email notification")
+        email = notifiers.EMAIL()
+        email.notify(snline + " - " + snatched_name, "Mylar notification - Snatch", module="[SEARCH]")
     return


@@ -8,33 +8,43 @@ from deluge_client import DelugeRPCClient
 class TorrentClient(object):
     def __init__(self):
         self.conn = None

-    def connect(self, host, username, password):
+    def connect(self, host, username, password, test=False):
         if self.conn is not None:
             return self.connect

         if not host:
-            return False
+            return {'status': False, 'error': 'No host specified'}
+        if not username:
+            return {'status': False, 'error': 'No username specified'}
+        if not password:
+            return {'status': False, 'error': 'No password specified'}

         # Get port from the config
         host,portnr = host.split(':')
+        #if username and password:
         #    logger.info('Connecting to ' + host + ':' + portnr + ' Username: ' + username + ' Password: ' + password )
         try:
             self.client = DelugeRPCClient(host,int(portnr),username,password)
         except Exception as e:
-            logger.error('Could not create DelugeRPCClient Object' + e)
-            return False
+            logger.error('Could not create DelugeRPCClient Object %s' % e)
+            return {'status': False, 'error': e}
         else:
             try:
                 self.client.connect()
             except Exception as e:
-                logger.error('Could not connect to Deluge ' + host)
+                logger.error('Could not connect to Deluge: %s' % host)
+                return {'status': False, 'error': e}
             else:
-                return self.client
+                if test is True:
+                    daemon_version = self.client.call('daemon.info')
+                    libtorrent_version = self.client.call('core.get_libtorrent_version')
+                    return {'status': True, 'daemon_version': daemon_version, 'libtorrent_version': libtorrent_version}
+                else:
+                    return self.client
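The rewritten `connect()` reports failures as status dicts instead of a bare `False`, which lets the caller show a reason in the UI. The argument-validation step can be sketched on its own (the helper name is illustrative and not part of the client):

```python
def validate_connection_args(host, username, password):
    """Pre-flight checks mirroring the rewritten connect(): dict results, never bare False."""
    if not host:
        return {'status': False, 'error': 'No host specified'}
    if not username:
        return {'status': False, 'error': 'No username specified'}
    if not password:
        return {'status': False, 'error': 'No password specified'}
    hostname, portnr = host.split(':')  # the port rides along in the host setting
    return {'status': True, 'host': hostname, 'port': int(portnr)}

print(validate_connection_args('localhost:58846', 'deluge', 'deluge'))
print(validate_connection_args('localhost:58846', '', 'deluge'))
```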
     def find_torrent(self, hash):
         logger.debug('Finding Torrent hash: ' + hash)
         torrent_info = self.get_torrent(hash)
@@ -85,16 +95,16 @@ class TorrentClient(object):
         else:
             logger.info('Torrent ' + hash + ' was stopped')
             return True

     def load_torrent(self, filepath):
         logger.info('filepath to torrent file set to : ' + filepath)
         torrent_id = False

         if self.client.connected is True:
             logger.info('Checking if Torrent Exists!')
             if not filepath.startswith('magnet'):
                 torrentcontent = open(filepath, 'rb').read()
                 hash = str.lower(self.get_the_hash(filepath)) # Deluge expects a lower case hash


@@ -192,6 +192,8 @@ def checkGithub(current_version=None):
 if mylar.COMMITS_BEHIND >= 1:
     logger.info('New version is available. You are %s commits behind' % mylar.COMMITS_BEHIND)
+    if mylar.CONFIG.AUTO_UPDATE is True:
+        mylar.SIGNAL = 'update'
 elif mylar.COMMITS_BEHIND == 0:
     logger.info('Mylar is up to date')
 elif mylar.COMMITS_BEHIND == -1:
@@ -319,4 +321,4 @@ def versionload():
 if mylar.CONFIG.AUTO_UPDATE:
     if mylar.CURRENT_VERSION != mylar.LATEST_VERSION and mylar.INSTALL_TYPE != 'win' and mylar.COMMITS_BEHIND > 0:
         logger.info('Auto-updating has been enabled. Attempting to auto-update.')
-        #SIGNAL = 'update'
+        mylar.SIGNAL = 'update'


@@ -2189,45 +2189,75 @@ class WebInterface(object):
     annualDelete.exposed = True

-    def ddl_requeue(self, id, mode):
-        myDB = db.DBConnection()
-        item = myDB.selectone("SELECT * FROM DDL_INFO WHERE ID=?", [id]).fetchone()
-        if item is not None:
-            if mode == 'resume':
-                if item['status'] != 'Completed':
-                    filesize = os.stat(os.path.join(mylar.CONFIG.DDL_LOCATION, item['filename'])).st_size
-                    mylar.DDL_QUEUE.put({'link': item['link'],
-                                         'mainlink': item['mainlink'],
-                                         'series': item['series'],
-                                         'year': item['year'],
-                                         'size': item['size'],
-                                         'comicid': item['comicid'],
-                                         'issueid': item['issueid'],
-                                         'id': item['id'],
-                                         'resume': filesize})
+    def ddl_requeue(self, mode, id=None):
+        myDB = db.DBConnection()
+        if id is None:
+            items = myDB.select("SELECT * FROM ddl_info WHERE status = 'Queued' ORDER BY updated_date DESC")
+        else:
+            oneitem = myDB.selectone("SELECT * FROM DDL_INFO WHERE ID=?", [id]).fetchone()
+            items = [oneitem]
itemlist = [x for x in items]
if itemlist is not None:
for item in itemlist:
if all([mylar.CONFIG.DDL_AUTORESUME is True, mode == 'resume', item['status'] != 'Completed']):
try:
filesize = os.stat(os.path.join(mylar.CONFIG.DDL_LOCATION, item['filename'])).st_size
except:
filesize = 0
resume = filesize
elif mode == 'abort':
myDB.upsert("ddl_info", {'Status': 'Failed'}, {'id': id}) #DELETE FROM ddl_info where ID=?', [id])
continue
elif mode == 'remove':
myDB.action('DELETE FROM ddl_info where ID=?', [id])
continue
else:
resume = None
mylar.DDL_QUEUE.put({'link': item['link'],
'mainlink': item['mainlink'],
'series': item['series'],
'year': item['year'],
'size': item['size'],
'comicid': item['comicid'],
'issueid': item['issueid'],
'id': item['id'],
'resume': resume})
linemessage = '%s successful for %s' % (mode, oneitem['series'])
if mode == 'restart_queue':
logger.info('[DDL-RESTART-QUEUE] DDL Queue successfully restarted. Put %s items back into the queue for downloading..' % len(itemlist))
linemessage = 'Successfully restarted Queue'
elif mode == 'restart':
logger.info('[DDL-RESTART] Successfully restarted %s [%s] for downloading..' % (oneitem['series'], oneitem['size']))
elif mode == 'requeue':
logger.info('[DDL-REQUEUE] Successfully requeued %s [%s] for downloading..' % (oneitem['series'], oneitem['size']))
elif mode == 'abort':
logger.info('[DDL-ABORT] Successfully aborted downloading of %s [%s]..' % (oneitem['series'], oneitem['size']))
elif mode == 'remove':
logger.info('[DDL-REMOVE] Successfully removed %s [%s]..' % (oneitem['series'], oneitem['size']))
return json.dumps({'status': True, 'message': linemessage})
ddl_requeue.exposed = True ddl_requeue.exposed = True
     def queueManage(self): # **args):
         myDB = db.DBConnection()
-        activelist = 'There are currently no items currently downloading via Direct Download (DDL).'
-        active = myDB.selectone("SELECT * FROM DDL_INFO WHERE STATUS = 'Downloading'").fetchone()
-        if active is not None:
-            activelist ={'series': active['series'],
-                         'year': active['year'],
-                         'size': active['size'],
-                         'filename': active['filename'],
-                         'status': active['status'],
-                         'id': active['id']}
         resultlist = 'There are currently no items waiting in the Direct Download (DDL) Queue for processing.'
-        s_info = myDB.select("SELECT a.ComicName, a.ComicVersion, a.ComicID, a.ComicYear, b.Issue_Number, b.IssueID, c.size, c.status, c.id, c.updated_date FROM comics as a INNER JOIN issues as b ON a.ComicID = b.ComicID INNER JOIN ddl_info as c ON b.IssueID = c.IssueID WHERE c.status != 'Downloading'")
+        s_info = myDB.select("SELECT a.ComicName, a.ComicVersion, a.ComicID, a.ComicYear, b.Issue_Number, b.IssueID, c.size, c.status, c.id, c.updated_date, c.issues, c.year FROM comics as a INNER JOIN issues as b ON a.ComicID = b.ComicID INNER JOIN ddl_info as c ON b.IssueID = c.IssueID") # WHERE c.status != 'Downloading'")
+        o_info = myDB.select("Select a.ComicName, b.Issue_Number, a.IssueID, a.ComicID, c.size, c.status, c.id, c.updated_date, c.issues, c.year from oneoffhistory a join snatched b on a.issueid=b.issueid join ddl_info c on b.issueid=c.issueid where b.provider = 'ddl'")
         if s_info:
             resultlist = []
             for si in s_info:
-                issue = si['Issue_Number']
-                if issue is not None:
-                    issue = '#%s' % issue
+                if si['issues'] is None:
+                    issue = si['Issue_Number']
+                    year = si['ComicYear']
+                    if issue is not None:
+                        issue = '#%s' % issue
+                else:
+                    year = si['year']
+                    issue = '#%s' % si['issues']
                 if si['status'] == 'Completed':
                     si_status = '100%'
                 else:
@@ -2236,18 +2266,161 @@ class WebInterface(object):
                                    'issue': issue,
                                    'id': si['id'],
                                    'volume': si['ComicVersion'],
-                                   'year': si['ComicYear'],
+                                   'year': year,
                                    'size': si['size'].strip(),
                                    'comicid': si['ComicID'],
                                    'issueid': si['IssueID'],
                                    'status': si['status'],
                                    'updated_date': si['updated_date'],
                                    'progress': si_status})
-        logger.info('resultlist: %s' % resultlist)
-        return serve_template(templatename="queue_management.html", title="Queue Management", activelist=activelist, resultlist=resultlist)
+        if o_info:
+            if type(resultlist) is str:
+                resultlist = []
+            for oi in o_info:
+                if oi['issues'] is None:
issue = oi['Issue_Number']
year = oi['year']
if issue is not None:
issue = '#%s' % issue
else:
year = oi['year']
issue = '#%s' % oi['issues']
if oi['status'] == 'Completed':
oi_status = '100%'
else:
oi_status = ''
resultlist.append({'series': oi['ComicName'],
'issue': issue,
'id': oi['id'],
'volume': None,
'year': year,
'size': oi['size'].strip(),
'comicid': oi['ComicID'],
'issueid': oi['IssueID'],
'status': oi['status'],
'updated_date': oi['updated_date'],
'progress': oi_status})
return serve_template(templatename="queue_management.html", title="Queue Management", resultlist=resultlist) #activelist=activelist, resultlist=resultlist)
     queueManage.exposed = True
def queueManageIt(self, iDisplayStart=0, iDisplayLength=100, iSortCol_0=0, sSortDir_0="desc", sSearch="", **kwargs):
iDisplayStart = int(iDisplayStart)
iDisplayLength = int(iDisplayLength)
filtered = []
myDB = db.DBConnection()
resultlist = 'There are currently no items waiting in the Direct Download (DDL) Queue for processing.'
s_info = myDB.select("SELECT a.ComicName, a.ComicVersion, a.ComicID, a.ComicYear, b.Issue_Number, b.IssueID, c.size, c.status, c.id, c.updated_date, c.issues, c.year FROM comics as a INNER JOIN issues as b ON a.ComicID = b.ComicID INNER JOIN ddl_info as c ON b.IssueID = c.IssueID") # WHERE c.status != 'Downloading'")
o_info = myDB.select("Select a.ComicName, b.Issue_Number, a.IssueID, a.ComicID, c.size, c.status, c.id, c.updated_date, c.issues, c.year from oneoffhistory a join snatched b on a.issueid=b.issueid join ddl_info c on b.issueid=c.issueid where b.provider = 'ddl'")
if s_info:
resultlist = []
for si in s_info:
if si['issues'] is None:
issue = si['Issue_Number']
year = si['ComicYear']
if issue is not None:
issue = '#%s' % issue
else:
year = si['year']
issue = '#%s' % si['issues']
if si['status'] == 'Completed':
si_status = '100%'
else:
si_status = ''
if issue is not None:
if si['ComicVersion'] is not None:
series = '%s %s %s (%s)' % (si['ComicName'], si['ComicVersion'], issue, year)
else:
series = '%s %s (%s)' % (si['ComicName'], issue, year)
else:
if si['ComicVersion'] is not None:
series = '%s %s (%s)' % (si['ComicName'], si['ComicVersion'], year)
else:
series = '%s (%s)' % (si['ComicName'], year)
resultlist.append({'series': series, #i['ComicName'],
'issue': issue,
'queueid': si['id'],
'volume': si['ComicVersion'],
'year': year,
'size': si['size'].strip(),
'comicid': si['ComicID'],
'issueid': si['IssueID'],
'status': si['status'],
'updated_date': si['updated_date'],
'progress': si_status})
if o_info:
if isinstance(resultlist, str):
resultlist = []
for oi in o_info:
if oi['issues'] is None:
issue = oi['Issue_Number']
year = oi['year']
if issue is not None:
issue = '#%s' % issue
else:
year = oi['year']
issue = '#%s' % oi['issues']
if oi['status'] == 'Completed':
oi_status = '100%'
else:
oi_status = ''
if issue is not None:
series = '%s %s (%s)' % (oi['ComicName'], issue, year)
else:
series = '%s (%s)' % (oi['ComicName'], year)
resultlist.append({'series': series,
'issue': issue,
'queueid': oi['id'],
'volume': None,
'year': year,
'size': oi['size'].strip(),
'comicid': oi['ComicID'],
'issueid': oi['IssueID'],
'status': oi['status'],
'updated_date': oi['updated_date'],
'progress': oi_status})
if sSearch == "" or sSearch is None:
filtered = resultlist[::]
else:
filtered = [row for row in resultlist if any([sSearch.lower() in row['series'].lower(), sSearch.lower() in row['status'].lower()])]
sortcolumn = 'series'
if iSortCol_0 == '1':
sortcolumn = 'series'
elif iSortCol_0 == '2':
sortcolumn = 'size'
elif iSortCol_0 == '3':
sortcolumn = 'progress'
elif iSortCol_0 == '4':
sortcolumn = 'status'
elif iSortCol_0 == '5':
sortcolumn = 'updated_date'
filtered.sort(key=lambda x: x[sortcolumn], reverse=sSortDir_0 == "desc")
rows = filtered[iDisplayStart:(iDisplayStart + iDisplayLength)]
rows = [[row['comicid'], row['series'], row['size'], row['progress'], row['status'], row['updated_date'], row['queueid']] for row in rows]
#rows = [{'comicid': row['comicid'], 'series': row['series'], 'size': row['size'], 'progress': row['progress'], 'status': row['status'], 'updated_date': row['updated_date']} for row in rows]
#logger.info('rows: %s' % rows)
return json.dumps({
'iTotalDisplayRecords': len(filtered),
'iTotalRecords': len(resultlist),
'aaData': rows,
})
queueManageIt.exposed = True
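queueManageIt follows the legacy DataTables server-side protocol (iDisplayStart/iDisplayLength/sSearch/iSortCol_0): filter, sort, then slice one page. A minimal standalone sketch of that pipeline; the function name and sample rows are illustrative, not part of Mylar:

```python
# Hedged sketch of the DataTables-style server-side pipeline used above:
# filter on a search term, sort by one column, then slice out a page.
def page_results(rows, search="", sortcolumn="series", desc=True, start=0, length=100):
    if search:
        filtered = [r for r in rows if search.lower() in r["series"].lower()
                    or search.lower() in r["status"].lower()]
    else:
        filtered = rows[:]
    filtered.sort(key=lambda r: r[sortcolumn], reverse=desc)
    page = filtered[start:start + length]
    return {
        "iTotalDisplayRecords": len(filtered),  # row count after filtering
        "iTotalRecords": len(rows),             # row count before filtering
        "aaData": page,
    }
```

Note the two distinct counts: DataTables needs the filtered total to render pagination and the unfiltered total for its "filtered from N" caption.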
def previewRename(self, **args): #comicid=None, comicidlist=None):
file_format = mylar.CONFIG.FILE_FORMAT
mylar.CONFIG.IMP_METADATA = bool(imp_metadata)
mylar.CONFIG.IMP_PATHS = bool(imp_paths)
mylar.CONFIG.configure(update=True)
# Write the config
logger.info('Now updating config...')
mylar.CONFIG.writeconfig()
"prowl_onsnatch": helpers.checked(mylar.CONFIG.PROWL_ONSNATCH),
"prowl_keys": mylar.CONFIG.PROWL_KEYS,
"prowl_priority": mylar.CONFIG.PROWL_PRIORITY,
"pushover_enabled": helpers.checked(mylar.CONFIG.PUSHOVER_ENABLED),
"pushover_onsnatch": helpers.checked(mylar.CONFIG.PUSHOVER_ONSNATCH),
"pushover_apikey": mylar.CONFIG.PUSHOVER_APIKEY,
"slack_enabled": helpers.checked(mylar.CONFIG.SLACK_ENABLED),
"slack_webhook_url": mylar.CONFIG.SLACK_WEBHOOK_URL,
"slack_onsnatch": helpers.checked(mylar.CONFIG.SLACK_ONSNATCH),
"email_enabled": helpers.checked(mylar.CONFIG.EMAIL_ENABLED),
"email_from": mylar.CONFIG.EMAIL_FROM,
"email_to": mylar.CONFIG.EMAIL_TO,
"email_server": mylar.CONFIG.EMAIL_SERVER,
"email_user": mylar.CONFIG.EMAIL_USER,
"email_password": mylar.CONFIG.EMAIL_PASSWORD,
"email_port": int(mylar.CONFIG.EMAIL_PORT),
"email_raw": helpers.radio(int(mylar.CONFIG.EMAIL_ENC), 0),
"email_ssl": helpers.radio(int(mylar.CONFIG.EMAIL_ENC), 1),
"email_tls": helpers.radio(int(mylar.CONFIG.EMAIL_ENC), 2),
"email_ongrab": helpers.checked(mylar.CONFIG.EMAIL_ONGRAB),
"email_onpost": helpers.checked(mylar.CONFIG.EMAIL_ONPOST),
"enable_extra_scripts": helpers.checked(mylar.CONFIG.ENABLE_EXTRA_SCRIPTS),
"extra_scripts": mylar.CONFIG.EXTRA_SCRIPTS,
"enable_snatch_script": helpers.checked(mylar.CONFIG.ENABLE_SNATCH_SCRIPT),
'failed_auto', 'post_processing', 'enable_check_folder', 'enable_pre_scripts', 'enable_snatch_script', 'enable_extra_scripts',
'enable_meta', 'cbr2cbz_only', 'ct_tag_cr', 'ct_tag_cbl', 'ct_cbz_overwrite', 'rename_files', 'replace_spaces', 'zero_level',
'lowercase_filenames', 'autowant_upcoming', 'autowant_all', 'comic_cover_local', 'alternate_latest_series_covers', 'cvinfo', 'snatchedtorrent_notify',
'prowl_enabled', 'prowl_onsnatch', 'pushover_enabled', 'pushover_onsnatch', 'boxcar_enabled',
'boxcar_onsnatch', 'pushbullet_enabled', 'pushbullet_onsnatch', 'telegram_enabled', 'telegram_onsnatch', 'slack_enabled', 'slack_onsnatch',
'email_enabled', 'email_enc', 'email_ongrab', 'email_onpost', 'opds_enable', 'opds_authentication', 'opds_metainfo', 'enable_ddl']
for checked_config in checked_configs:
if checked_config not in kwargs:
mylar.CONFIG.process_kwargs(kwargs)
#this makes sure things are set to the default values if they're not appropriately set.
mylar.CONFIG.configure(update=True, startup=False)
# Write the config
logger.info('Now saving config...')
return mylar.rsscheck.torrents(pickfeed='4', seriesname=search)
search_32p.exposed = True
def testprowl(self):
prowl = notifiers.prowl()
result = prowl.test_notify()
return "Error sending test message to Slack"
testslack.exposed = True
def testemail(self, emailfrom, emailto, emailsvr, emailport, emailuser, emailpass, emailenc):
email = notifiers.EMAIL(test_emailfrom=emailfrom, test_emailto=emailto, test_emailsvr=emailsvr, test_emailport=emailport, test_emailuser=emailuser, test_emailpass=emailpass, test_emailenc=emailenc)
result = email.test_notify()
if result is True:
return "Successfully sent email. Check your mailbox."
else:
logger.warn('Email test has gone horribly wrong. Variables used were [FROM: %s] [TO: %s] [SERVER: %s] [PORT: %s] [USER: %s] [PASSWORD: ********] [ENCRYPTION: %s]' % (emailfrom, emailto, emailsvr, emailport, emailuser, emailenc))
return "Error sending test message via email"
testemail.exposed = True
def testrtorrent(self, host, username, password, auth, verify, rpc_url):
import torrent.clients.rtorrent as TorClient
return 'Error establishing connection to Qbittorrent'
else:
if qclient['status'] is False:
logger.warn('[qBittorrent] Could not establish connection to %s. Error returned: %s' % (host, qclient['error']))
return 'Error establishing connection to Qbittorrent'
else:
logger.info('[qBittorrent] Successfully validated connection to %s [v%s]' % (host, qclient['version']))
return 'Successfully validated qBittorrent connection'
testqbit.exposed = True
def testdeluge(self, host, username, password):
import torrent.clients.deluge as DelugeClient
client = DelugeClient.TorrentClient()
dclient = client.connect(host, username, password, True)
if not dclient:
logger.warn('[Deluge] Could not establish connection to %s' % host)
return 'Error establishing connection to Deluge'
else:
if dclient['status'] is False:
logger.warn('[Deluge] Could not establish connection to %s. Error returned: %s' % (host, dclient['error']))
return 'Error establishing connection to Deluge'
else:
logger.info('[Deluge] Successfully validated connection to %s [daemon v%s; libtorrent v%s]' % (host, dclient['daemon_version'], dclient['libtorrent_version']))
return 'Successfully validated Deluge connection'
testdeluge.exposed = True
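The test* handlers above (testqbit, testdeluge) all collapse the same three outcomes: no response, a response whose 'status' is False (with an 'error' field), or success. A hedged sketch of that shared pattern; `summarize_client_check` is a hypothetical name, not an existing Mylar helper:

```python
# Generalized result handling for a torrent-client connection test:
# result is None/falsy (no connection), a dict with status False (+ error),
# or a dict with status True (success).
def summarize_client_check(result, name):
    if not result:
        return 'Error establishing connection to %s' % name
    if result.get('status') is False:
        return 'Error establishing connection to %s' % name
    return 'Successfully validated %s connection' % name
```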
def testnewznab(self, name, host, ssl, apikey):
logger.fdebug('ssl/verify: %s' % ssl)
if ssl == '0' or ssl == '1':
ssl = bool(int(ssl))
else:
if ssl == 'false':
ssl = False
else:
ssl = True
result = helpers.newznab_test(name, host, ssl, apikey)
if result is True:
logger.info('Successfully tested %s [%s] - valid api response received' % (name, host))
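The ssl branch at the top of testnewznab normalizes a form-submitted flag that can arrive as '0'/'1' (numeric fields) or 'true'/'false' (JavaScript booleans). A standalone helper capturing the intended mapping; the function name is illustrative, not part of Mylar:

```python
def parse_bool_flag(value):
    # '0'/'1' come from numeric form fields; 'true'/'false' from JS booleans.
    # Anything unrecognized falls through to True, matching the handler's default.
    if value in ('0', '1'):
        return bool(int(value))
    return value != 'false'
```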
myDB = db.DBConnection()
active = myDB.selectone("SELECT * FROM DDL_INFO WHERE STATUS = 'Downloading'").fetchone()
if active is None:
return json.dumps({'status': 'There are no active downloads currently being attended to',
'percent': 0,
'a_series': None,
'a_year': None,
'a_filename': None,
'a_size': None,
'a_id': None})
else:
filelocation = os.path.join(mylar.CONFIG.DDL_LOCATION, active['filename'])
#logger.fdebug('checking file existence: %s' % filelocation)
if os.path.exists(filelocation) is True:
filesize = os.stat(filelocation).st_size
cmath = int(float(filesize*100)/int(int(active['remote_filesize'])*100) * 100)
#logger.fdebug('ACTIVE DDL: %s %s [%s]' % (active['filename'], cmath, 'Downloading'))
return json.dumps({'status': 'Downloading',
'percent': "%s%s" % (cmath, '%'),
'a_series': active['series'],
'a_year': active['year'],
'a_filename': active['filename'],
'a_size': active['size'],
'a_id': active['id']})
else:
# myDB.upsert('ddl_info', {'status': 'Incomplete'}, {'id': active['id']})
return json.dumps({'a_id': active['id'], 'status': 'File does not exist in %s.<br /> This probably needs to be restarted (use the option in the GUI)' % filelocation, 'percent': 0})
check_ActiveDDL.exposed = True
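The cmath expression in check_ActiveDDL reduces to bytes-on-disk over the reported remote size, times 100. A simplified sketch of that calculation; the zero-size guard and the clamp to 100 are added assumptions, not present in the original code:

```python
def download_percent(local_bytes, remote_bytes):
    # Guard against a zero/unknown remote size reported by the provider (assumption).
    if not remote_bytes:
        return 0
    pct = int(local_bytes * 100 / remote_bytes)
    # Clamp: a partially flushed file can momentarily exceed the reported size (assumption).
    return min(pct, 100)
```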
def create_readlist(self, list=None, weeknumber=None, year=None):