
PSConf.EU – a quick review

This was my second trip to Hannover for PSConf.EU, a 4-day conference covering every corner of PowerShell.

Sessions

Let’s start with the most important part of a conference: the content. Just a quick scan of the agenda gave a hint of the quality lined up. With sessions from the PowerShell dev team and deep dives from community maestros there was plenty of highly technical content, mixed in with some easier intros to new topics.

For me some of the highlights were getting a first view of some of the new features coming from the DSC team for the LCM previews, straight from the horse’s mouth (well, Michael Greene, the team leader: https://social.msdn.microsoft.com/profile/Michael+Greene); more details here. And the look at Pester internals from Jakub Jares (b | t) was good as well; I now feel I know a lot more about the underpinnings, and the reasons behind some of the ways you have to use it make much more sense.

No bad sessions from my point of view, though there were a few issues with demos. Just from my point of view as a speaker: if you’re going to run a fragile multi-machine demo, record a good version ahead of time. While it’s always good to demo live, with those setups once it’s gone wrong it’s very hard to recover.

The days ran long, 08:30 – 20:00 on the Wednesday with the 3-hour workshops at the end. For me this is a bonus; when I’m at a conference I like to concentrate on the learning. At some conferences I’ve been to it’s felt as though you spend almost as much time at lunch or coffee breaks as you do in the sessions. So while the days were long, it felt worth it.

The venue is well lit and the AV is great; even from the back of the room the screens were clear and the PA carried the speaker’s voice right to the back. The rooms were closer together this year, and easier to navigate between. And despite the temperature being a step up from the UK, the rooms were kept pleasant but not overly cold. Plenty of food and drink too; the lunch menus weren’t holding anything back.

Community

I’m lucky enough to know quite a few PowerShell community members already, but I met plenty more at PSConf.EU. Everyone’s there for the same reason, so it was always easy to spark up a conversation with someone, whether sat next to them waiting for a session or in the queue for lunch. It was good to see the number of PowerShell user groups that were teased out over a lunchtime breakout session; hopefully the list will be up on the web shortly.

Party

Tuesday night was party night this year, back at Hannover Zoo for an evening of catching up and making new friends. The boat is well and truly pushed out with this venue: a demonstration from the zoo (polar bear feeding this year), plus free food and beer. I don’t think anyone went hungry or thirsty that night.

Logistics

From the UK this is a pretty easy conference to get to. From Nottingham I took the train down to Heathrow, then a 1-hour hop over with British Airways and a 20-minute train journey saw me in the centre of Hannover. Taxis are about €12 from the station to the conference hotel, but as I’d spent a lot of time sitting down I decided to take the 25-minute walk across.

Accommodation

This year I stayed in the conference hotel, the Congress Hotel am Stadtpark, next door to the main conference venue. Good hotel; the room was a decent size and had all the usual features. The bar was reasonably priced and usually full of PowerShell geeks if you wanted a chat over beers. The restaurant was good, but not cheap. The only downside was flaky WiFi, but that improved on Saturday once all the phones, tablets and laptops from the conference had left, for some reason…

Another great year from Tobias and his team. They’ve announced the dates for next year, 4th–7th June 2019, and they’ve been written into the calendar in ink as I’ll definitely be heading back.

Changes to SQL Relay session selection process 2018

This year the SQL Relay committee has decided to make a couple of changes to our session selection procedure.

Our aim this year is to be as inclusive and transparent about our process as we can manage. To that end, this post lays out our aims and how we plan to achieve them.

We want everyone to feel they can submit without worrying that their submission will be overlooked due to gender, disability, availability to do the whole tour or any other reason you may be concerned about.

To do this we are moving to a blind selection process. I (Stuart) will be the only member of the committee who can see the list of submissions. At the end of the Call for Sessions I will release only the session abstracts to the rest of the committee members, who will then choose the sessions based only on the abstracts.

Then, should we have openings due to people’s availability, we will go through the submissions again to fill any holes in the agenda.

If you have any special requirements then please add them to your submission. They won’t be used during the selection process, but they will allow us and our venues to ensure we cover them.
The Call for Papers is available here – SQL Relay 2018 CfP

If you are worried about any other reason your submission might be overlooked and want to talk it through, then please contact me (comments, contact form, or on Twitter @napalmgram). Anything you ask will be treated in confidence.

If you would like some help with, or an independent review of, your abstract then please get in touch with Speaking Mentors, where some of the best SQL session submitters can help you out.

New year, New speaking dates

I’m already lining up a few SQL Server user group speaking sessions for the year:

All sessions will be:

Indexing Nightmare – Cut Through the Clutter

Inherited a database with 30 indexes on every table? Has the vendor for your 3rd party app recommended more indexes over the years than you can remember? Got indexes that were added to fix a data load problem 4 years ago, but you’re not sure if they’re still being used? Indexes are key to SQL Server performance, but like everything, too much of a good thing is a bad thing. In this session we’ll look at how you can analyse your current indexes with the aim of consolidating them into useful ones, even removing some completely, and how to improve the ones you’ve got left.

Except for Southampton, where it’ll be:

Get on the Bus

As time goes by, more systems are crossing hosting boundaries (on premises, Azure, multi-cloud provider, ISVs). We need a simple mechanism to transfer data between these systems easily, quickly and reliably. Microsoft have taken their message bus technology and moved it to the cloud as Service Bus. This session will introduce you to this service and give examples of how internal databases can safely process data from cloud-hosted applications without having to be exposed to the InterTubes. Examples are predominantly .Net C#, but aren’t complex!

Dates

Nottingham SQL Server Usergroup – 12th January 2017
(Also presenting will be Steph Middleton, talking about Building a Robust SSIS Solution)
More details and registration here

Midlands/Birmingham SQL Server Usergroup – 19th January 2017
More details and registration here

Surrey SQL Server Usergroup (Guildford) – 20th February 2017
Link and details to be confirmed

Southampton SQL Server Usergroup – 1st March 2017
More details and registration here

Hope to see some of you there. And if there are any other user groups out there looking for speakers then let me know; I have presentations on SQL Server, PowerShell and general IT process to offer.

PowerShell module to automate restoring SQL Server backups

I’ve long been a proponent of automatically restoring as many of your SQL Server backups as you can. There are many benefits of doing this, amongst them:

  • Checking your backups are working: I’ve seen people with backups they can’t restore
  • Knowing how long it will take to recover: your boss isn’t going to be happy if they’re still waiting 3 hours after you said the business would be back up and running
  • Having logs of the restores: auditors love this sort of thing
  • Peace of mind knowing that your restores are going to work when the brown stuff hits the reciprocating draught machine.

I’ve been using a collection of scripts and functions based on my previous blogs and presentations on the subjects. But they’ve never really been in a state that I felt comfortable sharing with the world.

Until now.

Day 7 of 31 days of SQL Server Backup and Restore using PowerShell: Asynchronous backups – Part 1

So far all of the backup scripts we’ve looked at have been firing off backups synchronously; that is, each one happens after the previous one has completed. This is the same as happens with SQL Server maintenance plans.

Within PowerShell we have 2 ways of submitting multiple backup jobs at the same time, each with its own advantages and drawbacks.

Method 1 – SMO’s SqlBackupAsync method and PowerShell Events

SMO’s backup object offers its own SqlBackupAsync method which can simply be fired off. This can also be extended with PowerShell’s event subsystem to allow you to capture the status.

Method 2 – PowerShell Jobs

PowerShell 2 introduced Jobs into its bag of tricks. These allow you to fire off a block of script, let it run in the background while you do other work, and then pick up the results at a later point. Jobs are simpler to work with, and allow you to group elements of work that have to happen together (e.g. if you have 2 databases that must be backed up together, or you need to make sure a filesystem backup happens at the same time as the SQL Server backup).
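As a taste of what’s coming tomorrow, here’s a minimal sketch of the Jobs approach. For brevity it uses the Backup-SqlDatabase cmdlet that ships with SQL Server 2012’s SQLPS module rather than raw SMO; the server and database names are placeholders:

# Kick a set of backups off in the background as one job; these two
# placeholder databases are grouped so they back up together
$job = Start-Job -ScriptBlock {
    Import-Module "SQLPS" -DisableNameChecking
    Backup-SqlDatabase -ServerInstance "Server1" -Database "db1"
    Backup-SqlDatabase -ServerInstance "Server1" -Database "db2"
}

# Carry on with other work, then collect the results later
Wait-Job $job | Receive-Job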

Today we’ll look at Method 1, then move on to Method 2 tomorrow.

So without further ado, here are the basics of an asynchronous backup script:

import-module "SQLPS" -DisableNameChecking 
$ServerName="Server1" 
$BackupBath="c:psbackups\" 
$SQLSvr = New-Object -TypeName Microsoft.SQLServer.Management.Smo.Server($ServerName)
$Db= New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database
$Backups=@()
$i=0 
foreach($Db in $SQLSvr.Databases | Where-Object {$_.Name -ne "tempdb"}){
	$btmp=@{}
	$BackupConn= New-Object -TypeName Microsoft.SQLServer.Management.Smo.Server($ServerName)
	$Backup= New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
	$DeviceType= [Microsoft.SqlServer.Management.Smo.DeviceType"]::File
    $BackupName = $BackupPath+"\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    $Backup.BackupSetDescription = "Full Back of "+$db.Name
    $Backup.database = $Db.name
    $BackupDevice = New-Object -TypeName Microsoft.SQLServer.Management.Smo.BackupDeviceItem($BackupName,$DeviceType)
    $Backup.Devices.Add($BackupDevice)
    $btmp["backup"]= $Backup
    $btmp["connection"] = $BackupConn
    $backups += $btmp

    $backups[$i]["backup"].SqlBackupAsync($backups[$i]["connection"])
    $i++
}

This script will perform a full database backup of each database on the specified server except for tempdb. But it’s the PowerShell equivalent of throwing something and not worrying about where it lands or whether it shatters.

Working through the script:

It starts off normally, loading the SQLPS module, setting the now-usual variables, and creating the familiar Database and Server connection objects.

Then we have this line:

$Backups = @()

For those of you not familiar with PowerShell, this creates a new array variable called $Backups. We need it because for asynchronous backups we have to hold a connection object for EACH backup. Up till now, as we’ve been submitting the backups synchronously, they were queued up and submitted one by one, so they could all share the same connection. As we’re submitting these asynchronously, each needs its own connection. The simplest way to keep track of which connection goes with which backup is to track them all in a multidimensional array, and we build the first dimension here.

My old friend the counter $i is also set to 0; this will be used to keep track of our array elements.

Then we drop into our normal ForEach loop through the databases on our server.

First we build a temporary hash $btmp which will hold the backup information before we add it to $Backups. If you’ve not met a PowerShell hash before, it’s an associative array, so it holds data like:

$hash["author"]="stuart"
$hash["topic"]="PowerShell"

And since this is PowerShell, each value can in fact be an object, which is exactly what we’re going to use it for.
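To illustrate (the server name is just a placeholder), a hash value can hold a live SMO object, and you can use its properties and methods straight through the hash lookup:

$hash = @{}
$hash["connection"] = New-Object Microsoft.SqlServer.Management.Smo.Server("Server1")
# The stored value behaves like any other SMO Server object
$hash["connection"].Databases | Select-Object Name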

We build a new connection object for our backups called $BackupConn, and then a Backup object as normal. We then add these to our $btmp hash as follows:

$btmp["backup"]= $Backup
$btmp["connection"] = $BackupConn

And then we append this to our $Backups array.

As demonstrated in the next line, where we actually submit the backup:

$backups[$i]["backup"].SqlBackupAsync($backups[$i]["connection"])

We can reference our backup and its specific connection by indexing into our $Backups array, and then referencing the contained hash.

If you’re backing up enough databases with enough data you’ll have time to switch to SQL Server Management Studio and run sp_who2 to see all the connections and backup processes running.
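Or, if you’d rather not leave the console, something along these lines (a quick sketch; the server name is a placeholder) shows the same picture, with a progress figure thrown in:

# List the running backups and how far along they are
Invoke-Sqlcmd -ServerInstance "Server1" -Query "
    SELECT session_id, command, percent_complete
    FROM sys.dm_exec_requests
    WHERE command LIKE 'BACKUP%'"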

All the backups will now be submitted on their own connections and SQL Server will process them in whichever order it wants. That last point is very important to remember. When running backups asynchronously there is no guarantee of the order in which the backups will run, so if you need to ensure a certain order you’ll either need to stick to running them synchronously or look at the PowerShell Job methods in tomorrow’s post, which allow you to batch them up.

Now, this is all great, but as mentioned at the start of this post we’re just throwing the backups at SQL Server, with no way of knowing what’s completed. So we’ll now add some extra code to the script so we can get this information back.

To do this we’re going to use PowerShell events. Many of the objects PowerShell works with expose events which fire at certain points during the execution of a method. Your script can poll for these events, and then take appropriate action when they fire.

For this script we’re going to register the following 2 events:

Register-ObjectEvent -InputObject $Backups[$i]["backup"] -EventName "Complete" -SourceIdentifier "asyncbkup-complete-$($Db.Name)" -MessageData "$($Db.Name)-$i" | Out-Null
Register-ObjectEvent -InputObject $Backups[$i]["backup"] -EventName "Information" -SourceIdentifier "asyncbkup-info-$($Db.Name)" -MessageData "$($Db.Name)-$i" | Out-Null

These events fire when our backup completes, or when it wants to return some other informational message (e.g. an error). In each case we register the event against the specific backup object using our multidimensional array $Backups, specify the event we want to catch, give this particular event a SourceIdentifier to distinguish it from all other events, and then give it a ‘payload’ in MessageData which we can retrieve later.

We then need to poll the events until we get one of ours. There are a number of ways this loop can be run: either as a continuous while loop, or from a timer which fires at regular intervals.
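For reference, the timer flavour looks something like this rough sketch: a System.Timers.Timer raises an Elapsed event at a fixed interval, and the -Action block does the checking (the body here is just a placeholder):

$timer = New-Object System.Timers.Timer -Property @{ Interval = 5000; AutoReset = $true }
Register-ObjectEvent -InputObject $timer -EventName "Elapsed" -SourceIdentifier "backup-poll" -Action { Write-Host "Checking backup statuses..." } | Out-Null
$timer.Start()

# ... and once everything has finished:
$timer.Stop()
Unregister-Event -SourceIdentifier "backup-poll"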

In this example we’ll stick with the while loop, adding a “complete” value to our hash which we initially set to 0. As each backup completes (successfully or unsuccessfully) we’ll set the value to 1. If we sum all the “complete” values in our $Backups array, then once the sum equals the number of entries in the array we know that all the backups have finished and we can exit the loop.

While in the loop we use the Wait-Event cmdlet to catch any events. We use Tee-Object to grab the information from the event while still allowing us to pass it on to Remove-Event; we need to do that to take the event off the queue and prevent processing it repeatedly.

Then we split the payload into an array index and the database name, pull the ExecutionStatus from the backup object that fired the event, and get the error message (if any). Then, using a switch statement to catch the possible statuses, we build up an $output string to record the information, and mark the appropriate records as completed.

After the loop exits we clear all Event registrations, and output the message we built up:

Import-Module "SQLPS" -DisableNameChecking

$ServerName = "Server1"
$BackupBath = "c:\psbackups\"

$SQLSvr = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)
$Db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database

$Backups = @()
$i=0

foreach ($Db in $SQLSvr.Databases | Where-Object {$_.Name -neq "Tempdb"}){
    $btmp = @{}
    $BackupConn = New-Object -TypeName  Microsoft.SQLServer.Management.Smo.Server($ServerName)
    $Backup = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Backup
    $BackupName = $BackupPath+"\"+$Db.Name+"_"+[DateTime]::Now.ToString("yyyyMMdd_HHmmss")+".bak"
    $Backup.BackupSetDescription = "Full Back of "+$Db.Name
    $Backup.database = $Db.Name
    $Backup.Devices.AddDevice($BackupName,'File')
    $btmp["backup"]= $Backup
    $btmp["connection"] = $BackupConn
    $backups += $btmp

   Register-ObjectEvent -InputObject $backups[$i]["backup"] -EventName "Complete" -SourceIdentifier "asyncbkup-complete-$($Db.Name)"  -MessageData "$($Db.name)-$i" | Out-Null
   Register-ObjectEvent -InputObject $backups[$i]["backup"] -EventName "Information" -SourceIdentifier "asyncbkup-info-$($Db.Name)"  -MessageData "$($Db.Name)-$i" | out-null

    $backups[$i]["backup"].SqlBackupAsync($backups[$i]["connection"])
    $i++
}

$output = ""

while(($backups[0..$backups.length].Complete | Measure -sum).sum -lt $backups.Length){
    Wait-Event | Tee-Object -variable event | Remove-Event

    $ErrorText = $event.SourceEventArgs.Error.Message
    $DbName,$i = $event.MessageData.Split('-')

    $status = $event.SourceArgs[0].AsyncStatus.ExecutionStatus

    if($backups[$i]["Complete"] -ne 1){
        $now = get-date
    	switch ($status){
    	    "Succeeded" {
    		$output += "`n $DbName Completed at $now"
    		$backups[$i]["complete"] = 1
                break;}
    	    "Failed" {
    		$output += "`n $DbName Failed at $now with $ErrorText"
    		$backups[$i]["complete"]=1
    		break;}
    	    "InProgress"{
                $output += "`n $DbName In Progress at $now"
    		break;}
    	    "Inactive" {
    		$output += "`n $DbName Inactive at $now"
                break;}

    	}
    }
}

Get-EventSubscriber | Unregister-Event
write-output $output

As an example of the output from this script, here’s a sample from one of my test boxes where I’d marked one DB as offline:

InternetSales Failed at 09/09/2013 21:32:08 with Database 'InternetSales' cannot be opened because it is offline.
 dbgrow In Progress at 09/09/2013 21:32:09
 dbgrow In Progress at 09/09/2013 21:32:09
 AWDataWarehouse In Progress at 09/09/2013 21:32:09
 AWDataWarehouse In Progress at 09/09/2013 21:32:09
 fg_test In Progress at 09/09/2013 21:32:10
 fg_test In Progress at 09/09/2013 21:32:10
 fg_test In Progress at 09/09/2013 21:32:10
 fg_test In Progress at 09/09/2013 21:32:10
 HumanResources In Progress at 09/09/2013 21:32:10
 dbgrow Completed at 09/09/2013 21:32:10
 fg_test In Progress at 09/09/2013 21:32:10
 HumanResources In Progress at 09/09/2013 21:32:10
 msdb In Progress at 09/09/2013 21:32:11
 model In Progress at 09/09/2013 21:32:11
 AWDataWarehouse Completed at 09/09/2013 21:32:11
 modellock2 In Progress at 09/09/2013 21:32:11
 modellock In Progress at 09/09/2013 21:32:11
 model In Progress at 09/09/2013 21:32:11
 msdb In Progress at 09/09/2013 21:32:11
 psrestore In Progress at 09/09/2013 21:32:12
 modellock In Progress at 09/09/2013 21:32:12
 modellock2 In Progress at 09/09/2013 21:32:12
 ResellerSales In Progress at 09/09/2013 21:32:12
 SomeDB In Progress at 09/09/2013 21:32:12
 SomeDB In Progress at 09/09/2013 21:32:12
 psrestore In Progress at 09/09/2013 21:32:12
 SomeDB In Progress at 09/09/2013 21:32:12
 ResellerSales In Progress at 09/09/2013 21:32:12
 master In Progress at 09/09/2013 21:32:13
 HumanResources Completed at 09/09/2013 21:32:13
 restoretime In Progress at 09/09/2013 21:32:13
 restoredt In Progress at 09/09/2013 21:32:13
 fg_test Completed at 09/09/2013 21:32:13
 master In Progress at 09/09/2013 21:32:13
 Staging In Progress at 09/09/2013 21:32:13
 restoretime In Progress at 09/09/2013 21:32:13
 Staging In Progress at 09/09/2013 21:32:13
 restoredt In Progress at 09/09/2013 21:32:13
 psrestore Completed at 09/09/2013 21:32:13
 msdb Completed at 09/09/2013 21:32:13
 model Completed at 09/09/2013 21:32:13
 ResellerSales Completed at 09/09/2013 21:32:13
 modellock Completed at 09/09/2013 21:32:13
 modellock2 Completed at 09/09/2013 21:32:13
 test1 In Progress at 09/09/2013 21:32:13
 test1 In Progress at 09/09/2013 21:32:13
 Products In Progress at 09/09/2013 21:32:13
 TSQL2012 In Progress at 09/09/2013 21:32:13
 Products In Progress at 09/09/2013 21:32:13
 TSQL2012 In Progress at 09/09/2013 21:32:13
 master Completed at 09/09/2013 21:32:13
 Staging Completed at 09/09/2013 21:32:13
 SomeDB Completed at 09/09/2013 21:32:13
 restoretime Completed at 09/09/2013 21:32:13
 restoredt Completed at 09/09/2013 21:32:13
 test1 Completed at 09/09/2013 21:32:13
 Products Completed at 09/09/2013 21:32:13
 TSQL2012 Completed at 09/09/2013 21:32:13

This shows that we have multiple backups happening at once, with the finish order mostly determined by the size of the database and the speed of my test rig’s hard drive.

And if you switch to SSMS while the script is running and look at the connections/processes, you’ll see all the backups working and queueing as well.

PowerShell events are a powerful topic, and can be used in lots of other SQL Server scripting tasks. I’ve only scratched the surface of them here; a pair of good articles on them are:

Use Asynchronous Event Handling in PowerShell

Manage Event Subscriptions with PowerShell

which are well worth a read.

Tomorrow we’ll take a look at the PowerShell Job way of doing this.

This post is part of a series posted between 1st September 2013 and 3rd October 2013; an index for the series is available here.

PowerShell to move Windows Roles and Features between servers

I’ve been working a lot on building test and development environments for systems that didn’t have them. And as those ‘unimportant’ environments had been missed out, it wasn’t too surprising to find that the build documentation for the servers was less than stellar.

So, how could I make sure that the test and dev boxes had the same Windows roles and features without wasting too much time? Well, PowerShell is quickly becoming my default answer to anything like this. And it turns out there’s a nice 1-liner you can put together to do it as well (well, a 1-liner if you’ve already got the module imported 🙂 )


Import-Module ServerManager
Get-WindowsFeature | Where-Object {$_.Installed -eq $True} | Install-WindowsFeature -computername devbox

First up we import the ServerManager module; it’s loaded by default if you’re running Windows Server 2012, but it’s always good practice to make sure you’ve got it.

Get-WindowsFeature returns a list of all the available roles and features on the server; we filter that with Where-Object so we’re left with only those that are installed. Then we pass the whole lot to Install-WindowsFeature and tell it to install them onto the remote box.

And the nice feature of Install-WindowsFeature is that it will only install anything that’s missing, so this snippet can also be used to check for missing sub-features.
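If you’d like to see what it’s going to do before letting it loose, Install-WindowsFeature supports -WhatIf, so a dry run against the target box costs nothing:

Import-Module ServerManager
# Preview which roles and features would be added to devbox, without changing anything
Get-WindowsFeature | Where-Object {$_.Installed -eq $True} | Install-WindowsFeature -computername devbox -WhatIf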


Add-WindowsFeature cmdlet fails with error code 0x800f0922 when adding an IIS role/feature

[Screenshot: Add-WindowsFeature failing with error code 0x800f0922]

I was working on trying to install SSI (Server Side Includes) on a Windows 2012 IIS 8 box that had been set up by someone else. Doing everything the usual way, I was using PowerShell (as I do for all server admin these days, it’s just so much faster) like so:

import-module ServerManager
Add-WindowsFeature web-includes

Only to get the wonderful response shown above in the screenshot. Not very informative really. So I double-checked the syntax, but that looked fine. I tried going through the GUI instead: same error, it just took longer to pop up. I spun up a Windows 2012 VM I had kicking around on my laptop and ran the same snippet: success. So what could it be? Googling turned up some MS notes that didn’t offer much information relevant to this particular case, and in one case seemed to suggest a complete rebuild, as the error was due to a corrupted system/IIS config.

Not fancying a rebuild, I thought I’d delve deeper into the IIS configuration to see if there was anything that could be manually fixed. Doing this I discovered that the box had once been part of a shared-config setup across a number of boxes, and was now the only one left. BUT it was still set up with a shared config folder. That was quickly removed and the config exported locally. I reran Add-WindowsFeature, and this time everything added itself properly.

So it looks quite likely that the error is a catch-all (for IIS at least) for when the process can’t read or get a lock on the IIS config for one reason or another. So if it crops up, checking through the config files for anything funny is probably a very good first step.
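As a starting point, the shared-configuration pointer lives in redirection.config in the IIS config directory, so a quick look there (default path assumed; adjust if yours differs) will show whether the box is still redirecting its config:

# Check whether IIS is still pointing at a shared configuration folder
Select-String -Path "$env:windir\System32\inetsrv\config\redirection.config" -Pattern "configurationRedirection"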


Learning from more than just the content

While picking up new technical skills is the main reason for attending SQL Server events like user groups and SQL Saturdays, there’s another very good reason for attending: simply watching the presenters present is of great value.

All the speakers are obviously technically very good, but what really separates them from the other technical people present is their presentation style. There’s a wide range on display, from the ultra-focused business style talking about career enhancement to the enthusiastic geek motivator who’s all about getting you fired up about the newest tech.

But even with the different overall feels, there are still some common points that I aim to always incorporate into my presentations:

  • Be prepared. They make it look pretty effortless as they turn up and go, but that’s down to having rehearsed and practised the material, and knowing they have a backup should anything go wrong.
  • Engage with the audience. There isn’t a feeling of “you’ll sit there and watch and listen for the next 45 minutes”. They try to get the audience to interact with the material and think about how it would work for them.
  • Time management. They have time built in to answer questions on the fly, but also know where they are in the presentation so they can politely say ‘talk later’ if they’re running behind. And from watching them give the same presentation to multiple audiences, they also know how to extend it if no one’s asking questions.
  • Clear, easy-to-read materials. Slides aren’t cluttered with logos or fancy backgrounds. And occasionally there are handouts for frequently referenced info; this is a great idea, and makes it easy to take something away at the end.

So that’s just 4 quick and easy things to incorporate into your own presenting then! But in my opinion they all tie together, and the main thing you can do to bring them out is PRACTICE.

I’m sure my cats know more about SQL Server than any other cats on the planet, but they do insist on sitting there watching me present to no one with slides on the wall of the home office. Luckily my wife will also put up with me doing it, and in fact is often a source of good feedback. As a non-technical person, if she feels she’s sort of following the plot, I’ve got the level right.

And it’s very rare that I’ll give the same presentation twice. Each time I present something I try to get feedback (and really appreciate it being given) and then use that to feed into the next version of the presentation.


What I want from a Google Reader replacement

Like a lot of other people I wasn’t best pleased to see the announcement on Thursday that Google were killing off Google Reader. I’ve got it set up just right for quickly sorting through all the blogs I follow to find the stuff I’m interested in.

I follow a lot of different blogs on a number of different subjects, so there are often a couple of hundred posts to get through, which I can currently do pretty quickly. So to be a credible replacement for Google Reader I’m looking for a service/app that does the following:

  • Keyboard shortcuts. This has to be there, and there should be a shortcut for every major operation. This is one of the greatest time savers I’ve found with Reader: I can quickly ‘n’ my way through entire folders, ‘s’ing the ones I want to read when I’ve got time and ‘e’ing others to people who’d appreciate them
  • Open to any client. Over the years I’ve tried various mobile clients for Reader to find the best one, so I don’t want to be lumbered with a single client option
  • Clean simple interface. I really don’t want anything that tries to look like a newspaper or insists on silly transitions between articles. If you want to offer that, great, but let me turn it off and just have a plain view
  • Integration with IFTTT. OK, this may be more down to IFTTT picking what it supports, but I use this a lot: I take articles and then let IFTTT do the sharing to various streams for me. Quick and simples.
  • Folders. Really, this is basic stuff, but some of the suggested apps I’ve seen don’t have them
  • Don’t push the ‘Social’ aspects too hard. I just want the feeds aggregating. Not interested in what’s hot or what others are reading. If it’s genuinely good on the subject it’ll filter through the normal channels
  • No ads. I’m happy to fork out a few shekels a month to keep a service running, and if I do then keep the ads out of the way. Or if you want to use me as the product (like Google do/did) then that’s fine, but no injecting junk into my feeds.
  • Easy subscription mechanism. Getting new feeds into Reader was nice and simple, so let’s have that carried on

So those are the wants. Does anything meet them at the moment? Well, feedly seems OK, but falls down on an overly fussy UI, not enough keyboard shortcuts, having to use their mobile client, and no IFTTT support (yet).

I’ve had a look at Fever, which seems to offer lots of features. You have to host it yourself, but with a couple of domains to choose from that’s not a problem. It doesn’t do everything, but having the source code means I could probably tinker in the bits I want.

Reader’s not disappearing till June, so hopefully in the next 3 months we’ll see some of the existing products upping their game, and some new players appearing. In the meantime I’m going to carry on as I was, but keep an eye out for the next one to follow.