
Azure MSI: Connect Using PowerShell Or .NET?

Managed Service Identity (MSI) was introduced last year. Since then quite a few articles have been written about it:

  1. Use a Windows VM Managed Service Identity (MSI) to access Azure Key Vault
  2. Azure SQL authentication with a Managed Service Identity

MSI gives your code an automatically managed identity for authenticating to Azure services, so that you can keep credentials out of your code.

For most services, MSI can be enabled via PowerShell, ARM templates, or the Portal.

Enabling and configuring MSI is usually performed in 3 steps:

  1. Enable MSI for the source resource
  2. Grant the application SPN access to the target resource
  3. Add MSI authentication to the code hosted on the source resource
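
For a function app, steps 1 and 2 might look something like the sketch below, using the AzureRM cmdlets of the time; the resource group, app, and server names are placeholders:

# Step 1: enable MSI on the function app (AzureRM.Websites module; names are placeholders)
Set-AzureRmWebApp -ResourceGroupName "my-rg" -Name "my-function-app" -AssignIdentity $true

# Step 2: grant the new identity access in the target Azure SQL database.
# Run this T-SQL in the target database as an Azure AD admin:
#   CREATE USER [my-function-app] FROM EXTERNAL PROVIDER;
#   ALTER ROLE db_datareader ADD MEMBER [my-function-app];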

An example using Azure Functions can be found on GitHub.

I was surprised, though, to find that connecting to Azure SQL using PowerShell with MSI does not work when hosted in a function app.

Also included in the Visual Studio solution is the function app running in PowerShell. It fails to log in to Azure SQL Server with the following error:

Exception while executing function: Functions.dm_pdw_exec_sessions. Microsoft.Azure.WebJobs.Script: PowerShell script error. System.Management.Automation: Exception calling "Open" with "0" argument(s): "Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON'.". .Net SqlClient Data Provider: Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON

At this point I suspect impersonation is not working correctly with IIS hosting the function app.
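
A workaround that should sidestep the integrated-auth path is to fetch the token explicitly from the App Service MSI endpoint and attach it to the connection. A minimal sketch, assuming MSI is already enabled on the function app (which makes the MSI_ENDPOINT / MSI_SECRET variables available); the server and database names are placeholders:

# Request an access token for Azure SQL from the App Service MSI endpoint
$tokenUri = "$($env:MSI_ENDPOINT)?resource=https://database.windows.net/&api-version=2017-09-01"
$token = (Invoke-RestMethod -Uri $tokenUri -Headers @{ Secret = $env:MSI_SECRET }).access_token

# Attach the token to the connection instead of relying on integrated authentication
$conn = New-Object System.Data.SqlClient.SqlConnection
$conn.ConnectionString = "Server=tcp:myserver.database.windows.net,1433;Database=mydb;"
$conn.AccessToken = $token
$conn.Open()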


Azure Parallel Data Warehouse (PDW) Errors #1: Cancelled Transactions

Microsoft’s Azure data warehouse is a cloud-hosted PaaS offering, which means compute and storage resources are managed by Microsoft / the Azure fabric.

Scheduled and unscheduled maintenance activities occur that are supposed to be transparent, i.e. they should not impact the platform. This is a false assumption.

On some occasions maintenance will impact the platform. For example, back-end “node movements” (rotating compute / storage resources) will cause errors on one of the compute / data nodes to bubble up into a transaction, causing it to enter a cancelled state.

Protect your transactions

The best way to protect a transaction when dealing with Azure DW is to integrate retry logic into your code:

$retry = 5
while ($retry -gt 0)
{
    try
    {
        # Connect here and perform SQL / ETL operation ...

        # Success: end the loop
        $retry = 0
    }
    catch
    {
        # A retryable failure: use up one attempt
        $retry -= 1
        if ($retry -gt 0)
        {
            Start-Sleep -Seconds 30
        }
        else
        {
            # Out of retries: surface the original error
            throw
        }
    }
}

PrincessTenko’s Delayed Stock Transactions

Been getting questions about her transactions (please post them as comments on my blog), so I thought I’d clarify that the transactions section is delayed.

This means that there is a lag between the time a trade occurred and when it is printed, i.e. PrincessTenko dabbled in that stock a few periods ago.

See her in action (and possibly her arch rival soon) here: quant.peterirojah.com


Forecasting Database Growth

Been thinking about the most accurate method for forecasting. Closer to my line of work, I’d like to forecast database growth over a specified period of time into the future.

The common method I have seen used is to simply use Excel and plot a trend line from previously collected data. While this can be quite usable in most scenarios, I don’t believe it captures everything, such as seasonality. Seasonality here refers to the periodic data growth / purging that occurs. Most people ignore this or are simply not aware of it, but most databases will have some seasonal growth which needs to be added to the forecasting model.

The model used will be a 2-factor model of the form:

y = D + X + K

where y is the forecast database size, D is the seasonality variable, X is the stochastic variable with trend, and finally K is a constant of some sort.
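
As a toy illustration of how the terms combine (all numbers here are invented), evaluating the model might look like this in PowerShell:

# Toy evaluation of y = D + X + K (all numbers invented)
$K = 500                                        # K: constant base size in GB
$trendPerDay = 2                                # X approximated as a linear trend, in GB per day
$seasonalByMonth = @{ 1 = 10; 6 = -5; 12 = 40 } # D: seasonal offsets in GB, keyed by month

$daysAhead = 90
$targetMonth = (Get-Date).AddDays($daysAhead).Month
$D = if ($seasonalByMonth.ContainsKey($targetMonth)) { $seasonalByMonth[$targetMonth] } else { 0 }
$y = $D + ($trendPerDay * $daysAhead) + $K      # forecast size in GB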

Before we identify seasonality and subsequently forecast, we need to gather some data (a time series). In my case I’m going to collect the database size hourly, simply because I want to retain a level of precision.

Let’s use a simple Perl script (for portability; honestly, anything that can remotely execute a SQL query will do) which will connect to the database and run some SQL.

See the code for data gathering in part 2.
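
Until part 2 is up, here is a rough sketch of the idea (in PowerShell rather than Perl, and assuming SQL Server; the connection string and output path are placeholders):

# Collect the current database size and append a timestamped sample to a CSV
$connectionString = "Server=myserver;Database=mydb;Integrated Security=True"
$query = "SELECT SUM(size) * 8 / 1024 AS SizeMB FROM sys.database_files"

$conn = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = $query
$sizeMb = $cmd.ExecuteScalar()
$conn.Close()

# Schedule this to run hourly (e.g. Task Scheduler / cron) to build the time series
"{0},{1}" -f (Get-Date -Format "yyyy-MM-dd HH:00"), $sizeMb |
    Out-File -FilePath "C:\data\dbsize.csv" -Append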


VMware Server 2 Multicore Performance

Noticed that despite one of my VMs (Win2k8, 64-bit) being configured with 2 GB RAM and 2 CPU cores, the performance, even while idle, was not as expected. It was sluggish and so unresponsive that the snap-in loader timed out all the time. This was very frustrating; I increased the RAM, the pagefile, you name it, but still the same issue.

So I decided to allocate just one CPU core, and performance increased significantly. I’m going to investigate what the issue was; maybe the guest OSes were starving the host? I will delve deeper in another post.


Don’t want to learn Perl, Python, Shell … Ok, try On.Inno

There are people on this planet Earth, wonderful as it is, who don’t want to learn about the glue that holds all things together. And by glue I’m referring to the workhorse scripts that never really get the attention they deserve:

  • Perl
  • Python
  • Shell

One cannot truly say they have experienced a production infrastructure without typing at least one command from any of the above.

Well, those who are not ready for this level can jump to the next one. I have something for you called on.inno.

on.inno is a high-level scripting language that has 2 control constructs (IF … ELSE) … that’s it.

It’s simple; more to come.

Example:

EXECIFSTART<,>hostname<,>piroserv2<,>1

EXEC<,><!>This is a comment: embed anything here Perl|Python|Shell><,><,>1

EXECEND<,>echo if<,>if

Why did I create this simple language? I work in an environment that has different versions of OSes (Linux, Windows, and Solaris), and not all of them have the same components. To save myself from having to, say, install Perl on all of them just so I could drop some code there (ideal world), I created my own interpreted script to interface with the remote OS.