
Evernote 2 Beta


Posted by Daly de Gagne
Feb 16, 2007 at 02:45 PM

 

Evernote has gone into beta for version 2, and is promising a number of improvements related to handling information.

Here is a clip from the Evernote site:
    NEW LOOK AND FEEL
  * New UI elements, a perforated note splitter, and an improved Notebar make EverNote's design more consistent.
    NEW NAVIGATION FUNCTIONALITY
  * New top/bottom arrows on the Accelerator Bar allow quick jumping to the beginning and end of the tape.
  * An option has been added that places the tape scroll bar on the left side for left-handed users.
  * A View Menu has been added for easy navigation for menu-oriented users.
    ENHANCED CLIPPING
  * A Launch Universal Clipper command has been added to the Tools menu.
  * New option to launch Universal Clipper when EverNote starts up has been added to the Tools>Options menu.
    IMPROVED ORGANIZATION OF NOTES
  * The ability to manually sort categories has been added to enable users to arrange categories in any way they like.
  * A Sort Sibling Categories option in the Tools->Options menu now allows automatic sorting of all sub-categories.
  * A Refresh Filter button in Properties has been added.
  * Multi-color flag icons have been added in the Category Properties panel.
    IMPROVED BACKUP AND MAINTENANCE
  * A new Backup/Restore model stores backups in binary ENB format instead of XML, enabling faster recovery and better protecting private data from unauthorized access.
  * A redesigned Restore window now has two new buttons, Open and Delete, for more flexible management of backups.
  * An Advanced tab has been added in the File->Properties menu with Compact Database and Refresh Filters buttons.
  * The installer now uses a new version of InstallShield that is compatible with Windows XP and Vista requirements and enables incremental hotfixes.

Bug Fixes & Improvements:

    EDITING
  * Fixed a bug that prevented the automatic adding of rows in template tables on the Windows Vista platform.
  * Fixed the Print Preview toolbar icons to exclude the image of the New Note toolbar button.
  * Fixed a bug that disabled the tape scrollbar when there were notes in the Note List.
  * Fixed a bug that prevented scrolling of the tape when a note was selected in the Note List.
  * Fixed a bug that left the To-Do category assigned to a note even after an embedded To-Do Box was deleted from the note.
  * Fixed a bug that showed all categories in a note's Categories dialog.
    CLIPPING
  * Fixed a bug that prevented image clipping if no content or image was selected in the web page.
  * Fixed a bug that added frameset as a table.
    HELP FILE
  * Updated the Help file to reflect changes in desktop and portable versions.
  * Fixed a bug to close the Help file during upgrade.
    INK NOTES
  * Incorporated improved (repair) ink filter engine for better ink recognition.
  * Fixed a bug that caused figures to disappear when drawing with “Shape Recognizer” turned on.
  * Fixed a bug that caused crashes on the Windows Vista platform.

I have not tried all of the features. They sound good, and for people who like the Evernote metaphor they should be welcome.

My one main concern with Evernote is its inability, in previous versions and so far in this beta, to capture web pages accurately, placing elements in their proper positions rather than stringing them out in one long vertical column so that the main parts of the page end up far down. Sometimes EN makes an accurate capture; sometimes it does not. Given that I capture dozens of news pages a month, and often far more, this just is not acceptable. I have failed to understand why, given the current state of technical know-how, programs that aspire to a web clipping function cannot pull it off well. So far, the best at doing this is Surfulater http://www.surfulater.com/index.html .

Nonetheless, I will try out Evernote's beta for its other features. I am glad the program continues to develop, because it clearly has a loyal and growing following.

Daly

 


Posted by Graham Rhind
Feb 16, 2007 at 04:56 PM

 

Daly,

I’m not a user of Surfulater - is it able to capture all web pages “properly”?

Being in the process of upgrading my website at the moment, I would suggest that the problem lies in the increasing use of cascading style sheets (CSS) on websites, whereby the look and feel of the HTML page is governed by the CSS file. I would think that web clippers download the HTML page but do not download the CSS files (if they did, I imagine they would be difficult to store and use within the clipping programs, as each website has its own CSS files and they would be likely to clash). Look at an HTML page that references CSS files WITHOUT those CSS files and, indeed, you’d see the text in strange positions on the screen.
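
To make that concrete, here is a rough sketch of what a clipper would have to do to keep a page's layout: fetch the HTML, then fetch each linked stylesheet and embed it in the saved copy. This is purely hypothetical - the function name and the Python libraries (requests, BeautifulSoup) are mine for illustration, and it is not how EverNote, Surfulater, or any other program mentioned here actually works.

# Hypothetical sketch: inline a page's external CSS so a saved clip keeps its layout.
import requests
from urllib.parse import urljoin
from bs4 import BeautifulSoup

def clip_page_with_css(url):
    """Return the page's HTML with its external stylesheets inlined."""
    html = requests.get(url).text
    soup = BeautifulSoup(html, "html.parser")

    for link in soup.find_all("link", rel="stylesheet"):
        href = link.get("href")
        if not href:
            continue
        css_url = urljoin(url, href)      # resolve relative stylesheet paths
        try:
            css = requests.get(css_url).text
        except requests.RequestException:
            continue                      # skip stylesheets that cannot be fetched
        style = soup.new_tag("style")     # replace the link with an inline style block
        style.string = css
        link.replace_with(style)

    return str(soup)

# Example: save a self-contained clip to disk.
with open("clip.html", "w", encoding="utf-8") as f:
    f.write(clip_page_with_css("http://www.example.com/"))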

Also, CSS files add bloat to websites, so a clipper which downloaded them too would have to deal with that bloat.

Not sure how they’ll resolve that one (and it doesn’t bother me too much, I have to say), but I am intrigued to know whether there is a web clipper that has managed to get around the issue.

Graham

 


Posted by Graham Rhind
Feb 16, 2007 at 05:49 PM

 

Further to my post above: Surfulater, in its help file, admits that it cannot use the CSS from source sites because each site uses a different CSS file.  What it does instead is apply its own built-in CSS files to give clipped web pages a layout which may not be the same as on the original site, but which places the components better than leaving it to bare HTML to decide.

Clever .....
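
If I read the help file right, the approach amounts to something like the sketch below (my own guess at the general idea, in Python, not Surfulater's actual code): strip out the source site's stylesheet references and attach one generic, built-in stylesheet of the clipper's own.

# Rough illustration only - not Surfulater's actual code. Drop the source
# site's stylesheet references and apply one built-in stylesheet to every clip.
from bs4 import BeautifulSoup

BUILT_IN_CSS = """
body   { font-family: Verdana, sans-serif; margin: 1em; line-height: 1.4; }
img    { max-width: 100%; }
table  { border-collapse: collapse; }
td, th { border: 1px solid #ccc; padding: 2px 6px; }
"""

def apply_builtin_css(clipped_html):
    """Replace a clip's external stylesheet links with one generic style block."""
    soup = BeautifulSoup(clipped_html, "html.parser")

    # Discard the original site's stylesheet references entirely.
    for link in soup.find_all("link", rel="stylesheet"):
        link.decompose()

    # Attach the clipper's own stylesheet instead.
    style = soup.new_tag("style")
    style.string = BUILT_IN_CSS
    head = soup.head
    if head is None:                      # make sure there is a <head> to hold it
        head = soup.new_tag("head")
        (soup.html or soup).insert(0, head)
    head.append(style)

    return str(soup)

Because the same stylesheet is applied to every clip, the results stay consistent and avoid storing each site's own (often bulky) CSS, even if the clip is not pixel-identical to the original - which fits what the help file describes.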

 


Posted by Daly de Gagne
Feb 16, 2007 at 08:10 PM

 

Graham, thanks for the insights. What you say makes sense. I have also read your post that follows the one to which I am replying. When I said that Surfulater does an accurate, though not perfect, job of capturing a particular web page, it is probably because it is using its own resources to create a reasonable facsimile of the original. The vast majority of the time, I find Surfulater’s page captures more than satisfactory, even if they are not absolutely identical in appearance to the original. However, I cannot say the same for most other programs, and that is the main reason I am using Surfulater and not some of the others.

Thanks again for the insights.

Daly


