Introduction and basic usage

What does the Folder Watch addon do?

Folder Watch is an addon which can be installed and enabled via Settings -> Extensions (scroll down) -> Folder Watch.
Folder Watch allows you to let JD monitor one or multiple folders for special .crawljob files and containers such as .DLC files.
This can be useful if, for example, you are running JD on a server and want it to process all URLs added via this method without any further user interaction in JD.
After activating the Folder Watch addon, JD by default monitors the folder [JD Install dir]/folderwatch every 1000 ms.
Processed files will be moved to a subfolder of the watched folder called "added", e.g. to [JD Install dir]/folderwatch/added.
To start, you can run this simple test using a .DLC container:
1. Install and enable Folder Watch.

2. Open JDownloader and export some added URLs as a .DLC container via Rightclick -> Other -> Create DLC.
3. Delete the previously exported URLs in JD and move the created .DLC file into the above-mentioned "folderwatch" default folder.
4. After a few seconds, the URLs inside your .DLC container will appear in your LinkGrabber and the .DLC container will be moved to "[JD Install dir]/folderwatch/added".
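Steps 3 and 4 can also be scripted. Here is a minimal Python sketch of dropping a container file into the watched folder; the paths and file names used here are placeholders for this demonstration, not JD defaults, so adjust them to your real [JD Install dir]:

```python
import shutil
import tempfile
from pathlib import Path

# Simulated install dir for this sketch; in practice use your real
# [JD Install dir] (all paths here are assumptions).
jd_install_dir = Path(tempfile.mkdtemp())
watch_folder = jd_install_dir / "folderwatch"
watch_folder.mkdir(parents=True)

# Stand-in for the container exported via Rightclick -> Other -> Create DLC.
dlc_file = jd_install_dir / "export.dlc"
dlc_file.write_text("placeholder container content")

# Moving the file into the watched folder is all that is needed; Folder
# Watch polls roughly every 1000 ms and, after processing, moves the
# file to the "added" subfolder.
shutil.move(str(dlc_file), str(watch_folder / dlc_file.name))
print((watch_folder / "export.dlc").exists())
```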
If you want to use the full potential of Folder Watch, continue reading!

.crawljob files and how to use them:

Using .crawljob files, you can tell JD how to process the URLs they contain whenever it processes such a file.
The .crawljob files explained below can also be used without Folder Watch, e.g. to add URLs once with a custom package name, download path and so on:
you can add .crawljob files to the LinkGrabber just like .DLC containers or normal URLs. The Folder Watch extension has to be enabled for this to work, though!

Here is an overview of all possible crawljob fields:
Additional information:
UNSET = the existing global setting will be used.
You may also leave a field out completely instead of setting it to UNSET.

Field name (data type)
    Description and example(s)

priority (ENUM)
    Download priority.
    priority=HIGHEST
    priority=HIGHER
    priority=HIGH
    priority=DEFAULT
    priority=LOW
    priority=LOWER
    priority=LOWEST

comment (String)
    Comment.
    comment=ExampleComment

text (String)
    Text containing the URL(s) to add.
    text=http://www.hornoxe.com/kreuzfahrtschiff-wird-verlaengert-zeitraffer/

enabled (BooleanStatus)
    Enable/disable items added via this crawljob.
    enabled=TRUE
    enabled=FALSE
    enabled=UNSET

type (ENUM)
    type=NORMAL

downloadPassword (String)
    Set this if a password is required to add this URL, e.g. a folder password.
    downloadPassword=MyDownloadPassword

downloadFolder (String)
    Sets the download path.
    downloadFolder=C:\Users\test\Downloads

chunks (int [0-20])
    Max. number of connections per download.
    chunks=0 | use the globally defined limit
    chunks=3 | try to force 3 connections

packageName (String)
    Package name.
    packageName=DesiredPackageName

filename (String)
    Filename. Use only if text contains a single URL!
    filename=ForceThisFilename.zip

forcedStart (BooleanStatus)
    Enforce immediate download start?
    forcedStart=TRUE
    forcedStart=FALSE
    forcedStart=UNSET

autoStart (BooleanStatus)
    Auto-start downloads after this item has been added to the LinkGrabber?
    autoStart=TRUE
    autoStart=FALSE
    autoStart=UNSET

autoConfirm (BooleanStatus)
    Auto-move added item(s) to the download list.
    autoConfirm=TRUE
    autoConfirm=FALSE
    autoConfirm=UNSET

overwritePackagizerEnabled (Boolean)
    Should properties set via this crawljob overwrite packagizer rules?
    overwritePackagizerEnabled=true
    overwritePackagizerEnabled=false

setBeforePackagizerEnabled (Boolean)
    setBeforePackagizerEnabled=true
    setBeforePackagizerEnabled=false

deepAnalyseEnabled (Boolean)
    Deep-analyze URLs added via this crawljob? Useful for URLs of websites which are not supported by a JD plugin, e.g. URLs whose HTML code contains more (directly downloadable) URLs.
    deepAnalyseEnabled=false

addOfflineLink (Boolean)
    Add a dummy offline link if the added items were processed by a crawler and that crawler detected that the URL you were trying to crawl is offline.

extractAfterDownload (BooleanStatus)
    extractAfterDownload=TRUE
    extractAfterDownload=FALSE
    extractAfterDownload=UNSET

extractPasswords (String[])
    List of possible extraction passwords.
    extractPasswords=["Password1","Password2"]

There are two allowed crawljob formats. Both can be used by storing the data in a .crawljob file and moving this file into the folder which JD is watching.

Format 1: Text format. This can only hold one crawljob:

extractPasswords=["Password1","Password2"]
downloadPassword=123456Test
enabled=TRUE
text=http://cdn8.appwork.org/speed.zip
packageName=MyPackageName
filename=NewFilename.zip
comment=SuperUsefulComment
autoConfirm=TRUE
autoStart=TRUE
extractAfterDownload=FALSE
forcedStart=FALSE
downloadFolder=C:\Users\test\Downloads
overwritePackagizerEnabled=false
The above example is attached to this article as example1.crawljob!
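For automation, the text format can be generated with a few lines of Python. This is a sketch only; the field values are taken from the example above, while the output folder name is an assumption standing in for [JD Install dir]/folderwatch:

```python
from pathlib import Path

# Target folder is an assumption; in practice this is
# [JD Install dir]/folderwatch.
watch_folder = Path("folderwatch")
watch_folder.mkdir(exist_ok=True)

# Format 1 is one key=value pair per line and holds exactly one crawljob.
fields = {
    "text": "http://cdn8.appwork.org/speed.zip",
    "packageName": "MyPackageName",
    "downloadFolder": r"C:\Users\test\Downloads",
    "enabled": "TRUE",
    "autoConfirm": "TRUE",
    "autoStart": "TRUE",
}
job_text = "\n".join(f"{key}={value}" for key, value in fields.items()) + "\n"
(watch_folder / "example1.crawljob").write_text(job_text, encoding="utf-8")
```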

Format 2: JSON format. You can put multiple crawljobs into one file:

[{
"extractPasswords": ["Password1","Password2"],
"downloadPassword": "123456Test",
"enabled": "TRUE",
"text": "http://cdn8.appwork.org/speed.zip",
"packageName": "MyPackageName",
"filename": "NewFilename.zip",
"comment": "SuperUsefulComment",
"autoConfirm": "TRUE",
"autoStart": "TRUE",
"extractAfterDownload": "FALSE",
"forcedStart": "FALSE",
"downloadFolder": "C:\\Users\\test\\Downloads",
"overwritePackagizerEnabled": false
}]

The above example is attached to this article as example1_json.crawljob!
Please keep in mind that syntax errors in your JSON will cause processing of your crawljob to fail.
The file will still be moved to the "added" folder!

Save one of the above variants in a .crawljob file or see attached files example1.crawljob and example1_json.crawljob.
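Because hand-written JSON is prone to exactly the syntax errors warned about above, it can help to generate the file programmatically. A sketch under the assumption that the file is later moved into the watched folder; the second job and the output file name are made up for illustration:

```python
import json
from pathlib import Path

# Two crawljobs in one file; Format 2 allows any number of jobs.
# The second job here is a hypothetical example, not from the article.
jobs = [
    {
        "text": "http://cdn8.appwork.org/speed.zip",
        "packageName": "MyPackageName",
        "downloadFolder": "C:\\Users\\test\\Downloads",
        "enabled": "TRUE",
        "autoConfirm": "TRUE",
        "autoStart": "TRUE",
        "overwritePackagizerEnabled": False,
    },
    {
        "text": "http://cdn8.appwork.org/speed.zip",
        "packageName": "SecondPackage",
        "enabled": "TRUE",
    },
]

# json.dumps always emits syntactically valid JSON, so the crawljob
# cannot fail because of malformed syntax.
Path("example_multi.crawljob").write_text(json.dumps(jobs, indent=2), encoding="utf-8")
```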


What does the above example do?

You can guess most of it by looking at the LinkGrabber tab in JD once the URL gets added, but to make it clear, here is a screenshot with an explanation below:

  • Adds URL http://cdn8.appwork.org/speed.zip
  • Sets two possible extract passwords on all added items: "Password1" and "Password2"
  • Disables extraction
  • Sets download password "123456Test"
  • Enables added item(s)
  • Sets the package name to MyPackageName
  • Sets filename to NewFilename.zip
  • Enables auto confirm and auto start -> added item(s) will be moved to the download list and downloads will start after a few seconds
  • Sets download-folder to C:\Users\test\Downloads

