Health Care Analytics & Joining the Dots.

After a series of obscure posts it's time to join the dots. I feel compelled to do this after a recent post leeched its way into my LinkedIn feed. It was most probably user error on my part, but there's no getting the genie back in the bottle now.

These posts are little more than engineering notes for me and my small team. They’re solutions that work for our use cases and environment. They were not intended to be packaged for public consumption (hence the informal language and sloppy grammar).

So, I explained how to coerce Microsoft SQL Server 2012 into exporting JSON and how to set it up as a Stored Procedure or Agent job. I then showed how to access it via PowerShell or PHP.

Ok, not rocket science. No big deal. So what?

Well, our use case is Instrumentation for Clinical Decision Support. One example is this Sankey Diagram illustrating, in near real time, the flow of patients through an integrated health system. These are pretty standard techniques in most industries, but they're very uncommon in US Health Care, and work like this sets my team apart from our peers. The only people doing anything like this (that I'm aware of) are IBM with their Patient Care & Insights product.

[Figure: Sankey diagram of near real-time patient flow through the health system]

Largish Hospitals and Health Systems tend to spend seven figures buying BI platforms to integrate with their EHR/EMR. I'm fortunate to work for a health system that realizes 80% of such functionality doesn't fit any existing use case. These platforms offer to support decisions our clinicians are not currently trying to make and to answer questions we haven't even thought of. While they may provide insight, we're not yet equipped to handle or respond to such insight. Our senior clinical leaders are the first to say, "Cool, but so what?"

The clinical decisions they’re asking me to support can be answered easily with some basic TSQL. I can communicate answers very simply and inexpensively with SQL Server Reporting Services and D3.js.

Learning Data Driven Documents (D3.js) is proving to be quite some journey. It’s one of the few tools that makes data visually beautiful by default. It offers Extrinsic Rewards over and above the Intrinsic Value of our analysis and synthesis.

Earlier posts have shown how to do some of this with SSRS. The next few posts will be recipes and tutorials explaining how to do some of what we're doing with D3 in a Healthcare Setting.

Connecting to MS SQL Server with PHP for JSON Example

Following on from a previous post about coercing MS SQL Server 2012 to produce JSON, here is a short PHP script to deliver said JSON to a JavaScript function (use it in the callback; there's a sketch of that after the script).


<?php
$serverName = "(local)";
$uid = "myUserId";
$pwd = "myPassword";
$connectionInfo = array( "UID"=>$uid,
                         "PWD"=>$pwd,
                         "Database"=>"myTestDatabase");

/* Connect using SQL Server Authentication. */
$conn = sqlsrv_connect( $serverName, $connectionInfo);
if( $conn === false )
{
     echo "Unable to connect.</br>";
     die( print_r( sqlsrv_errors(), true));
}

/* Query SQL Server  */
$tsql = "EXEC SerializeJSON 'SELECT TOP 10 * FROM myTestTable'";

$stmt = sqlsrv_query( $conn, $tsql);
if( $stmt === false )
{
     echo "Error in executing query.</br>";
     die( print_r( sqlsrv_errors(), true));
}

/* Retrieve the result and emit the JSON as the response body
   (no trailing markup, so the callback can parse it cleanly). */
$row = sqlsrv_fetch_array($stmt);
echo $row[0];

/* Free statement and connection resources. */
sqlsrv_free_stmt( $stmt);
sqlsrv_close( $conn);

?>
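On the JavaScript side, the callback mentioned above might look something like this. It's a minimal sketch that assumes the script above is saved as getJSON.php (a placeholder name) and uses D3 v3's callback-style d3.json:

// Request the JSON emitted by the PHP script above and use it in the callback.
d3.json("getJSON.php", function(error, data) {
  if (error) { console.error(error); return; }

  // data is the array built by SerializeJSON: one object per row,
  // with every value delivered as a string.
  d3.select("body").selectAll("p")
    .data(data)
    .enter()
    .append("p")
    .text(function(d) { return JSON.stringify(d); });
});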

As it turns out, this method doesn’t seem to need a named instance specified (?!). Can’t explain that.

Setting up Remote Access to SQL Server 2012 is a royal PITA. Daniel_Walsenbach seems to have a good explanation here, from way back in 2007. Stack Overflow has too many threads on this, but I'd suggest starting here. Microsoft's documentation on their PHP drivers is… dense, to put it politely. PHP.net's own documentation seems to have more user-contributed questions than working examples.

Still, this seems to work for me and hopefully for you too.

EDIT: TSQL to enable the 'remote access' option on a SQL Server instance.


USE myDatabaseName ;
GO
EXEC sp_configure 'remote access', 1 ;
GO
RECONFIGURE ;
GO

Example SQL Server TSQL Stored Procedure to produce JSON

With no native support for JSON in MS SQL Server 2012, hacks and work-arounds abound.

One of the best I've come across is a tidy little stored procedure from Jamin Quimby.

It works a treat and seems to produce JSON that validates cleanly every time.

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE PROCEDURE [dbo].[SerializeJSON] (@ParameterSQL AS VARCHAR(MAX))
AS
BEGIN
	DECLARE @SQL NVARCHAR(MAX)
	DECLARE @XMLString VARCHAR(MAX)
	DECLARE @XML XML
	DECLARE @Paramlist NVARCHAR(1000)

	SET @Paramlist = N'@XML XML OUTPUT'
	-- Wrap the caller's query in a CTE and capture its FOR XML output
	SET @SQL = 'WITH PrepareTable (XMLString) '
	SET @SQL = @SQL + 'AS ( '
	SET @SQL = @SQL + @ParameterSQL + ' FOR XML RAW, TYPE, ELEMENTS'
	SET @SQL = @SQL + ') '
	SET @SQL = @SQL + 'SELECT @XML = [XMLString] FROM [PrepareTable]'

	EXEC sp_executesql @SQL
		, @Paramlist
		, @XML = @XML OUTPUT

	SET @XMLString = CAST(@XML AS VARCHAR(MAX))

	DECLARE @JSON VARCHAR(MAX)
	DECLARE @Row VARCHAR(MAX)
	DECLARE @RowStart INT
	DECLARE @RowEnd INT
	DECLARE @FieldStart INT
	DECLARE @FieldEnd INT
	DECLARE @KEY VARCHAR(MAX)
	DECLARE @Value VARCHAR(MAX)
	DECLARE @StartRoot VARCHAR(100) = '<row>'
	DECLARE @EndRoot VARCHAR(100) = '</row>'
	DECLARE @StartField VARCHAR(100) = '<'
	DECLARE @EndField VARCHAR(100) = '>'

	SET @RowStart = CharIndex(@StartRoot, @XMLString, 0)
	SET @JSON = ''

	WHILE @RowStart > 0
	BEGIN
		SET @RowStart = @RowStart + Len(@StartRoot)
		SET @RowEnd = CharIndex(@EndRoot, @XMLString, @RowStart)
		SET @Row = SubString(@XMLString, @RowStart, @RowEnd - @RowStart)
		SET @JSON = @JSON + '{'
		-- for each row
		SET @FieldStart = CharIndex(@StartField, @Row, 0)

		WHILE @FieldStart > 0
		BEGIN
			-- parse node key
			SET @FieldStart = @FieldStart + Len(@StartField)
			SET @FieldEnd = CharIndex(@EndField, @Row, @FieldStart)
			SET @KEY = SubString(@Row, @FieldStart, @FieldEnd - @FieldStart)
			SET @JSON = @JSON + '"' + @KEY + '":'
			-- parse node value
			SET @FieldStart = @FieldEnd + 1
			SET @FieldEnd = CharIndex('</', @Row, @FieldStart)
			SET @Value = SubString(@Row, @FieldStart, @FieldEnd - @FieldStart)
			SET @JSON = @JSON + '"' + @Value + '",'
			SET @FieldStart = @FieldStart + Len(@StartField)
			SET @FieldEnd = CharIndex(@EndField, @Row, @FieldStart)
			SET @FieldStart = CharIndex(@StartField, @Row, @FieldEnd)
		END

		IF LEN(@JSON) > 0
			SET @JSON = SubString(@JSON, 0, LEN(@JSON))
		SET @JSON = @JSON + '},'
		--/ for each row
		SET @RowStart = CharIndex(@StartRoot, @XMLString, @RowEnd)
	END

	IF LEN(@JSON) > 0
		SET @JSON = SubString(@JSON, 0, LEN(@JSON))
	SET @JSON = '[' + @JSON + ']'

	SELECT @JSON
END

It takes any SQL query as a parameter (just don't end the query with a semicolon, since the procedure appends a FOR XML clause to it).

EXEC [dbo].[SerializeJSON] 'SELECT * FROM myTable';
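For a hypothetical two-column table (PatientID and Unit, both placeholder names), the procedure returns a single VARCHAR(MAX) value shaped like this. Note that every value, numeric or not, comes back as a quoted string, and a NULL column is simply omitted from that row's object:

[{"PatientID":"1","Unit":"ICU"},{"PatientID":"2","Unit":"ED"}]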

When you set up this stored procedure you'll need to grant execute permissions. I have a small team of seasoned high performers (lucky me!), so I can afford to be more liberal with permissions and simply grant execute to the SQL Server public role.


GRANT EXECUTE ON dbo.SerializeJSON TO public;

Now you could schedule this as a SQL Agent Job to dump your JSON file regularly to a wwwroot directory, but in a shared development environment that's cumbersome, especially if coders need permissions to modify (or copy) the Agent Job on the SQL Server. And if you're using SQL Server Express, you don't even have Agent Jobs.

An alternative is to create a PowerShell script with promiscuous permissions.

Import-Module 'sqlps' -DisableNameChecking;

Invoke-Sqlcmd `
 -ServerInstance "myHostName\myInstanceName" `
 -Database "myDatabaseName" `
 -Query "EXEC [dbo].[SerializeJSON] 'SELECT TOP 100 * FROM myTableName';";
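As written, that just echoes the result to the console. To actually land the JSON in a wwwroot directory, capture the single unnamed column the procedure returns and write it to disk. This is a minimal sketch; the paths and the -MaxCharLength value are assumptions for your environment (without raising -MaxCharLength, Invoke-Sqlcmd truncates character columns at 4,000 characters):

# Capture the JSON and write it to a file the web server can reach.
$result = Invoke-Sqlcmd `
 -ServerInstance "myHostName\myInstanceName" `
 -Database "myDatabaseName" `
 -MaxCharLength 8000000 `
 -Query "EXEC [dbo].[SerializeJSON] 'SELECT TOP 100 * FROM myTableName';";

$result.ItemArray[0] | Out-File "C:\inetpub\wwwroot\data\myData.json" -Encoding utf8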

Allow the script to run by setting the execution policy in the PowerShell console,

Set-ExecutionPolicy Unrestricted

Now this little .ps1 script can be copied, shared and edited with a text editor. You could use Windows Task Scheduler to run this periodically to refresh your test data.
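Here's a minimal sketch of that Task Scheduler piece, using the ScheduledTasks cmdlets that ship with Windows 8 / Server 2012 and later (the task name, schedule, and script path are placeholders; run it from an elevated console):

# Register a daily task that re-runs the JSON dump script.
$action = New-ScheduledTaskAction -Execute "powershell.exe" `
 -Argument "-ExecutionPolicy Bypass -File C:\Scripts\DumpJSON.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName "RefreshTestJSON" -Action $action -Trigger $trigger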

Simple. Effective.


SmartyStreets, the key to Choropleth Heat Maps


Recently I've been doing lots of map-making with D3.js. Specifically, Choropleths.

[Figure: example choropleth map]

If you find yourself trying to clean up location data, you need to introduce yourself to the boys and girls at SmartyStreets. Their web service has been a huge time-saver for cleaning up addresses, geocoding them, and obtaining the corresponding US county and FIPS codes.

Their tech support is awesome. Props to Jeffrey, their Street Genius!