Enhancement for Empty Tabs

Hmmm. Who'd have thunk... yet another browser.

But you know what - it looks promising! It's like the people who made Opera what we used to love so much agreed that we were no longer getting what we loved and decided to do something about it :slight_smile:. Yay!

Thanks for pointing me to it!

Thanks again Sasa, I'm updating this post from the latest Vivaldi snapshot now :slight_smile:.

It's a bit slow and clunky to respond for me at this stage and some things just don't work right at all, aaaand it's still based on the Chrome engine. But it's promising! I'm going to keep following its progress.

I get my mouse rocker gestures (hooray), my hotkey presses on links work as hoped, and I could easily import old Opera bookmarks! Two huge losses in the move to 'new' Opera.

:sunglasses:

My Xmas gift to you :slight_smile:.

I'm currently running both. As you said, Vivaldi is still beta, but every new version runs better.

Regarding the emptiness in empty tabs, I like the idea of filling it with life! It's the same emptiness we see when selecting a folder and looking over to the viewer pane... nothing. What a good place to show some kind of folder summary, album/movie art, subfolders, a description.txt or plain-text/html generated on the fly by a script or something.

So, "functional emptiness" in empty tabs or the viewer pane would be a welcome addition.

Btw: Another Opera+Vivaldi user here! o) @Steje: Try the 32-bit Vivaldi (in case you haven't), it feels faster than the currently more experimental 64-bit build.

How do you all get empty tabs? I have a fav list of tabs (different for left and right lister) and of course none of them is empty :slight_smile:.

On the other hand, it would be cool to place favorite folders/links and maybe commands into an empty tab, like browsers do with sites (maybe also with "collections"/folders containing a bunch of other folders/links, like browsers have). It would also be very touch-friendly.

The idea is not that bad.

Well, for me these empty tabs appear just by opening new tabs! o)
These tabs do not load a path automatically, as I find that useless and too slow at times (what happens depends on settings).

You could set new tabs to open a folder where you put shortcuts to folders or commands as you desire, and set it to open in thumbnails mode.

Oh I tried, not that bad actually! o)

After having created a flat list of shortcut files, I thought I could group them into folders, so I created folders like "network", "disk" and "favorite" and sorted the shortcuts into these. Then I wanted to save the folder format as "flat view grouped", but forgot that flat view is not a folder format, so it cannot be used in newly opened tabs without scripts or embedded commands. The experiment stopped at this point, but it was quite an interesting approach.
I guess throwing in a script column and switching to the group-view could give a "My Computer" look-and-feel for this "home" folder. Thanks o).

If only scripts and embedded commands existed! :slight_smile:

Yeah, I guess that'd give a similar 'effective' result. It's not the same from a maintenance standpoint (treating big thumbnail-sized buttons just like buttons as far as Alt-Click editing etc.), and you don't get any contextual awareness (highlight state, toggle state, icon control).

It was just an idea to make this 'Empty Tab' thing useful - since it's "there" and some browsers do the same to good effect. I was going for an App / modern Tiles sort of thing... and this was the only existing place in Opus where it sort of made sense to do something like that.

Would much rather have OnCopy and OnCopyConflict events :slight_smile:...

And OnRename()! o)
We could make good use of OnRename() as well: it would fire for successful rename operations and allow us to "live"-mirror any rename to backup locations, for example. Creating a log would also be possible, which could then be applied to the backup data before the actual backup happens. This would prevent huge files from being transferred over and over again just because their filename changed a bit.

It's the same idea as SQL transaction logs. Steje, I hope you don't mind! o)
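
Just to make the idea a bit more concrete - since OnRename() does not exist, this is purely hypothetical. A minimal PowerShell sketch of the mirroring step, assuming the event would hand over the old and new full paths; the backup roots and the naive path handling are made up for illustration:

```powershell
# Hypothetical mirroring step for a single rename.
# $oldPath/$newPath would come from the (not yet existing) OnRename event.
param(
    [string]$oldPath,
    [string]$newPath,
    [string[]]$backupRoots = @('X:\Backup', 'Y:\Backup')   # illustrative locations
)

foreach ($root in $backupRoots) {
    # Rebuild the matching path inside the backup root
    # (naive drive-letter strip, purely for illustration).
    $backupOld = Join-Path $root ($oldPath.Substring(3))
    $newName   = Split-Path $newPath -Leaf

    if (Test-Path -LiteralPath $backupOld) {
        # Apply the same rename in the backup instead of re-copying the huge file later.
        Rename-Item -LiteralPath $backupOld -NewName $newName
    }
}
```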

LOL bud... Seems the 'Empty Tab' turning into a 'Utility Tab' idea won't bring enough value vs effort if you can just simulate it the way Leo suggested, so.. GAME ON!

Besides - I'm hijacking my OWN thread with plugs for other sh!t... so - PILE ON!

In fact - I'm intrigued by why you think we need an OnRename event. Couldn't we just write our own script command to entirely replace the current RAW Rename command - and handle whatever you want in there?

I inline rename a lot, or use the advanced rename tool for a lot of files at once; unfortunately there's no way to get in between and find out which items were renamed and how. So no, a script command does not help, unless I overlooked something - then please let me know! o)

Ah... yeah - the Advanced rename dialog is a great reason NOT to try to replace the built-in Rename. That's a lot of work to redo what GPSoft already did really well. For similar reasons, I'm not inclined to try to write an entire Copy proxy either...

Interesting use-case. If you were to write such a thing against such a theoretical event, I'd very likely use it. I have a series of backup drives that I copy the same content to. I'm just not interested in backup/sync utilities to keep them consistent. I handle it myself, and periodically diff individual folders against my portable drives to get the backup copies updated. It's not very efficient - but it doesn't take THAT long, and I find 'looking' at the differences reacquaints me with my content and what I was working on while on the road.

This fairly often involves a bunch of renamed content, which takes way longer to copy to a series of backup drives than it takes to sniff out the differences. Having a log like you described to reapply to the backup set would cut out most of the time spent simply reconciling differences (although again, it doesn't take a really long time - especially if I'm maintaining it frequently).

+1

The file log already exists and can be configured to list individual files from batch renames, if I remember correctly.

But are there not also changes you make to files from outside of Opus? I can't imagine a scenario where you wouldn't still end up having to compare the folders the same way you are doing now.

Yeah - you're right of course. But for "me"... that sort of thing wouldn't be intended to prevent the need for some type of manual intervention, so much as to just make shorter work of a task that I've already decided to handle by manually syncing my content to the backup drives.

Also, when working on ONE of the backup sets WHILE connected to the others, I'd probably like renames in one of them to be echoed to the data sets in the other two backups, so... REAL-time mirroring of rename operations as opposed to replaying a series of events later from some log recording.

Anyhow - just my two cents on tbone's suggestion... it'd be something I'd likely use myself in a few scenarios.

Sounds like a lot of error-prone, manual effort to avoid finding a decent backup tool. OTOH, I've looked for decent backup tools and know your pain. :slight_smile: They seem as rare as decent database tools/APIs, and even more inflexible if you don't happen to want to use them exactly the way their authors wanted to back things up. Then you find a good one and it seems to get taken over by management who run it into the ground (mentioning no product names :slight_smile:).

This "mirrored" rename/move thing (whether realtime or not) is for huge files and folders in the first place. An ever-growing collection of data gets to be a pain in the butt sooner or later, especially with HD and 4K video files now.

Currently, a rename/move on some of these files or folders containing video/photo/music-related data yields hours of additional backup time, since you cannot realistically repeat all the changes manually in the backup locations. I don't know if many people back up their 5TB+ disks, but I do (twice), and given a steady 50 MB/s transfer speed across your LAN, it takes nearly 28 hours (5,000,000 MB ÷ 50 MB/s ≈ 100,000 seconds) to fill/update a 5TB hard disk.

The last full backup I ran took around 16 hours, mainly because of renaming or moving data around (not adding much new). Look at the timestamps of these log files: the backup started at around 1:30 in the morning and ended around 18:00 in the evening, just because it needed to copy all those huge files over again, since they had not been renamed in the backup.


Yeah - I started my IT career in Backup/Recovery, though in the enterprise rather than the consumer/desktop space. I have a whole other level of hatred and disrespect for desktop backup applications because of the demands I was always used to having to satisfy in the datacenter. SyncBack and all of those other backup apps for home users are things I actually never spent much time developing specific issues with, because I dismissed them REALLY fast based on a quick perception of just the sort of things you mentioned... grrrr.

I don't think adopting tbone's suggestion would be particularly error-prone though... in a script, I can make sure the file being renamed in the source does in fact exist in the backups, and can even check that it's got the same date and size, plus other checks to make sure I'm auto-mirror-renaming the intended things, etc. And outside of that, I already 'prefer' to manage the syncing of my backup datasets manually 'anyway'. So even without the sort of 'help' this could provide, I'd still be doing it the manual way like I am now.
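
Something along these lines is all I mean by 'checks' - a throwaway PowerShell sketch, with the function name and parameters made up just to illustrate:

```powershell
# Hypothetical guard before auto-mirror-renaming anything in a backup set.
function Test-SafeToMirror {
    param(
        [string]$sourceFile,   # the renamed file in the source (new name)
        [string]$backupFile    # its counterpart in the backup (still the old name)
    )

    if (-not (Test-Path -LiteralPath $backupFile)) { return $false }

    $src = Get-Item -LiteralPath $sourceFile
    $bak = Get-Item -LiteralPath $backupFile

    # Only mirror the rename when size and timestamp agree, so we never rename
    # something in the backup that isn't actually the same file.
    $sameSize = $src.Length -eq $bak.Length
    $sameTime = $src.LastWriteTimeUtc -eq $bak.LastWriteTimeUtc
    return ($sameSize -and $sameTime)
}
```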

But I might do some other interesting things as well. The rename dialog isn't hard to call up of course, but I often find myself making the same pattern of renames multiple times for related files, just by cursoring down the file list, because I don't feel like manually typing in the patterns and such. I might do something with an OnRename() event like detecting whether other files in a series exist in the same folder as something I'm INLINE renaming, and then prompt me to approve a similar auto-detected pattern rename for the other files... and other esoteric things.

I like automation - but I often prefer there to be some intelligent 'prompts' along with kicking off said automation, and one of the things I REALLY like about how you guys implemented the event-handling aspect of the scripting features is that we can do things as normal, and apply some intelligence during and after executing something to modify and customize the outcome. That's as opposed to having to manually select an alternate way of doing an action (a qualifier key, which some of us are running out of :slight_smile:, or yet ANOTHER button on my MANY toolbars). What you've done lends itself very well to contextual automation... and I think rename, copy and copy conflicts have recently emerged as some areas where we'd like to play :slight_smile:...

Getting back to the OnRename()/OnMove() event-thing.

Over the last few weeks I created "FOPTracker", a little filesystem-watching framework/tool. It's able to track filesystem changes like create/delete/rename of files and folders etc. Special feature #1: it's modular (plugins), so you can hook your own logic into it. Special feature #2: it's able to detect move operations, which is something I have not seen before. It's not magic: FOPTracker simply associates delete and create operations within a small timespan; if the file names are equal, voila, there's the move.
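
For the curious, the pairing trick boils down to something like this - a rough sketch only, not FOPTracker's actual code (the path, time window and output are purely illustrative):

```powershell
# Rough sketch of the delete+create pairing idea.
$watcher = New-Object System.IO.FileSystemWatcher 'D:\Data'   # illustrative path
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true

$global:pendingDeletes = @{}                    # file name -> time of the delete event
$global:moveWindow     = [TimeSpan]::FromSeconds(2)

Register-ObjectEvent $watcher Deleted -Action {
    $name = [IO.Path]::GetFileName($Event.SourceEventArgs.FullPath)
    $global:pendingDeletes[$name] = Get-Date
} | Out-Null

Register-ObjectEvent $watcher Created -Action {
    $name = [IO.Path]::GetFileName($Event.SourceEventArgs.FullPath)
    $seen = $global:pendingDeletes[$name]
    if ($seen -and ((Get-Date) - $seen) -lt $global:moveWindow) {
        # Same name deleted and created again within the window: treat it as a move.
        Write-Host ("Move detected: {0}" -f $Event.SourceEventArgs.FullPath)
        $global:pendingDeletes.Remove($name)
    }
} | Out-Null
```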

A DO plugin is included: watch filesystem changes in the DO script console, or react to them by modifying the accompanying script command. The command will be triggered with all the information required to mirror events to another lister/location on the fly, with the commands and tools DO has to offer. No need to set DO's installation path anywhere; it is auto-detected.

By default, FOPTracker will create PowerShell scripts for each tracked location (this can be disabled). These scripts look like logfiles, but they also run as PowerShell scripts. When run, they redo and repeat all tracked filesystem activities in the location of your choice. I use them to mirror all delete/rename/move operations to my backup locations before the actual backup runs. This saves tons of time after heavy renaming or moving of big files and folders, since they don't need to be deleted from the backup only to be copied to it again in a different place a few seconds later.
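
Conceptually, such a log-that-is-also-a-script is just a sequence of lines like these (made-up paths, not FOPTracker's exact output format):

```powershell
# Example entries only - paths invented for illustration.
Rename-Item -LiteralPath 'D:\Video\holiday.mp4' -NewName 'holiday_2014.mp4'          # rename
Move-Item   -LiteralPath 'D:\Video\old\clip.mp4' -Destination 'D:\Archive\clip.mp4'  # move
Remove-Item -LiteralPath 'D:\Temp\render.tmp'                                        # delete
```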

You can create MS-DOS batch log/redo files as well; just note that this is not tested as heavily as the PowerShell logging. The tracker can be run on "non-existing" drives and locations; it will just sit and wait until you mount the drive or path it is watching. You can also eject/remove the tracked location at any time, and FOPTracker will pause. You can track multiple locations in parallel with one FOPTracker instance (the PowerShell memory footprint is huge, so this saves some hundreds of MB of RAM compared to running multiple instances). There are a bunch of parameters to fine-tune the tracking to specific files/folders or exclude specific items by type and size.

Requirements:

  • PowerShell v3+ (Win7 comes with v2, download: microsoft.com/en-us/downloa ... x?id=34595)
  • FOPTracker-ScriptCommand and DOpus v11 (optional)
  • Since this is "fresh" software, don't run it on expensive space shuttles
  • Admin elevation maybe?

FOPTracker running and waiting for activity... (you don't need to keep the window open)



DOpus script console, showing incoming FOPTracker events..


A demonstration redo/repeat PowerShell script/logfile...


Download v0.2:
FOPTracker.zip (18.2 KB)
Give it a go - I'm quite happy with it! o)
Uh, this off-topic post got bigger than I intended, sorry! o)