Outliner Software Forum

How do you mark the internet as "finished"?


Posted by Wayne K
Sep 29, 2013 at 05:54 PM

 

Dr Andus wrote:
Wayne K wrote:
>> It’s just a waste of time to download the article again, figure out that it’s a duplicate, and delete it. Often I change the name of the file when I download it from whatever default comes up. The next time I download the same article I might give it a slightly different name, so then I have to open both files to make sure they’re duplicates.
>
> How about using some web capture software instead, like Surfulater? Then you wouldn’t need to go through the PDF production process, and it’s very quick to capture a page (right-click, “add new article to Surfulater/or whatever” and that’s it).
>
> You can organise the pages into folders, and then it’s a lot easier to see whether there is any duplication in the folder. Plus there is the Filter tool in SF that filters the captured content on the basis of the title of the web page, again, to see duplication.
>
> Though I imagine you must have your reasons why you want them as PDFs.

I tried web capture software a couple of years ago. The stopping point for me was the poor markup capabilities. Surfulater couldn’t even highlight text that you’ve captured (I confirmed that with their tech support - maybe it’s changed since then). By capturing pages as PDFs, I can take advantage of software that can do any kind of markup I can imagine. I use PDF Revu, but I know there are other excellent choices for PDF markup.

I also like the idea of staying relatively software-neutral with a file format that’s likely to be around longer than I am.

As for seeing duplicates in folders, I can do the same thing with PDF files. It’s just that it slows down the research and capture process. I like having some automatic organization on the front end to save time on the back end.