• Containers,  Microsoft

    Docker Oh Docker

    The dark world of Containers and Images… oh my. As I step my feet back into the Container World… and bring back the horror that is Hyper-V… I am reminded that my personally built Mage Dev PC does not really fully support the whole of Hyper-V – at least not how Docker Desktop wants to use it. Yeah, it works fine on my ThinkPads (including my mini Yoga X1)… but alas not on the main desktop dev box. Oh woe!

    But wait… there is a way…. I ran across this blog entry Docker without Hyper-V… and it really works… The long and short of it… there are two methods to the madness…

    Option 1 script:

    # Install Windows feature containers
    $restartNeeded = $false
    if ((Get-WindowsOptionalFeature -FeatureName containers -Online).State -ne 'Enabled') {
        $restartNeeded = (Enable-WindowsOptionalFeature -FeatureName containers -Online).RestartNeeded
    }
    
    if (Get-Service docker -ErrorAction SilentlyContinue)
    {
        Stop-Service docker
    }
    
    # Download the zip file.
    $json = Invoke-WebRequest https://download.docker.com/components/engine/windows-server/index.json | ConvertFrom-Json
    $stableversion = $json.channels.stable.alias
    $version = $json.channels.$stableversion.version
    $url = $json.versions.$version.url
    $zipfile = Join-Path "$env:USERPROFILE\Downloads\" $json.versions.$version.url.Split('/')[-1]
    Invoke-WebRequest -UseBasicParsing -OutFile $zipfile -Uri $url
    
    # Extract the archive.
    Expand-Archive $zipfile -DestinationPath $Env:ProgramFiles -Force
    
    # Modify PATH to persist across sessions.
    $newPath = [Environment]::GetEnvironmentVariable("PATH",[EnvironmentVariableTarget]::Machine) + ";$env:ProgramFiles\docker"
    $splittedPath = $newPath -split ';'
    $cleanedPath = $splittedPath | Sort-Object -Unique
    $newPath = $cleanedPath -join ';'
    [Environment]::SetEnvironmentVariable("PATH", $newPath, [EnvironmentVariableTarget]::Machine)
    $env:path = $newPath
    
    # Register the Docker daemon as a service.
    if (!(Get-Service docker -ErrorAction SilentlyContinue)) {
      dockerd --exec-opt isolation=process --register-service
    }
    
    # Start the Docker service.
    if ($restartNeeded) {
        Write-Host 'A restart is needed to finish the installation' -ForegroundColor Green
        If ((Read-Host 'Do you want to restart now? [Y/N]') -eq 'Y') {
          Restart-Computer
        }
    } else {
        Start-Service docker
    }

    This works just fine… and now Option 2 (using DockerMsftProvider):

     

    $paths = $env:psmodulePath.Split(';')
    $modulePath = Join-Path $paths[0] "DockerMsftProvider"
    if (!(Test-Path $modulePath)) {
      New-Item -Path $modulePath -ItemType Directory
    }
    $outfile = Join-Path $modulePath 'DockerMsftProvider.psm1'
    Invoke-WebRequest -UseBasicParsing -OutFile $outfile -Uri https://raw.githubusercontent.com/ajkauffmann/MicrosoftDockerProvider/master/DockerMsftProvider.psm1
    
    $outfile = Join-Path $modulePath 'DockerMsftProvider.psd1'
    Invoke-WebRequest -UseBasicParsing -OutFile $outfile https://raw.githubusercontent.com/ajkauffmann/MicrosoftDockerProvider/master/DockerMsftProvider.psd1
    
    Install-Package Docker -ProviderName DockerMsftProvider -Force

    Now for Option 2 – don’t forget to do a Start-Service docker… or things will complain…

    Now I did change up the module path a bit to push Docker up to my system module area instead of to my user’s Documents path… and I did notice that Option 2 got a newer version of Docker… Oh well, you choose.

    And yes docker commands work… Kitematic works… and Portainer works…

    Btw if you want Portainer… 

    docker pull portainer/portainer
    docker run -d --restart always --name portainer --isolation process -h portainer -p 9000:9000 -v //./pipe/docker_engine://./pipe/docker_engine portainer/portainer

    There you go!

    ~ScottGeek….

     

  • Microsoft

    C# Full Timestamp

    Because I’m always having to look this up… a full (useful) timestamp should always look like this…

    YYYYMMDD HH:MM:SS.fff   => 20200730 14:58:00.000   

    string timeStamp = DateTime.Now.ToString("yyyyMMdd hh:mm:ss.fff tt");  //Example 20200730 03:02:39.591 PM
    string fileTimeStamp = DateTime.Now.ToString("yyyyMMdd_HHmmssfff");    //20200730_150535040
    string fileTimeStamp2 = DateTime.Now.ToString("yyyyMMdd_hhmmssffftt");// 20200730_030707929PM
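
    And if you ever need to go the other way, the same custom format string parses right back. A quick sketch of the round trip – not part of the original snippet, and it assumes System.Globalization is in scope for CultureInfo:

    // Round-trip: parse one of the timestamps above back into a DateTime
    // using the exact same custom format string.
    DateTime parsed = DateTime.ParseExact("20200730 03:02:39.591 PM",
                                          "yyyyMMdd hh:mm:ss.fff tt",
                                          CultureInfo.InvariantCulture);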

     

    ~SG

     

  • ASP.NET Core,  Blazor,  Microsoft

    The Server Side Blazor and SignalR Timeout

    So in my Blazor travels I ran across one (or another) annoying thing. In this one we will talk about that server timeout? Disconnect? Retry? Not really sure which one it is.
    This affects Blazor Server Side, where SignalR is being used between the browser and the server-side code.


    I came across some JavaScript that can go into the _Host.cshtml file.

    <script>
        // Wait until a 'reload' button appears
        new MutationObserver((mutations, observer) => {
            if (document.querySelector('#components-reconnect-modal h5 a')) {
                // Now every 10 seconds, see if the server appears to be back, and if so, reload
                async function attemptReload() {
                    await fetch(''); // Check the server really is back
                    location.reload();
                }
                observer.disconnect();
                attemptReload();
                setInterval(attemptReload, 10000);
            }
        }).observe(document.body, { childList: true, subtree: true });
    </script>

    Not really a solution, but let’s see. The script is basic: it waits for the reconnect modal to appear, checks that the server is reachable again, and reloads the page as needed.

    There’s a full rundown here: ASP Core Issues. Another good resource is this docs page: ASP.NET Core SignalR configuration.
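
    On the server side the relevant timeouts are configurable. A minimal sketch of what that might look like in Startup.ConfigureServices for a Blazor Server app – the 60/15 second values here are just illustrative, not something from the original post:

    using System;
    using Microsoft.Extensions.DependencyInjection;

    public void ConfigureServices(IServiceCollection services)
    {
        services.AddRazorPages();

        // Tune the SignalR hub that Blazor Server uses for its circuit.
        services.AddServerSideBlazor()
            .AddHubOptions(options =>
            {
                // How long the server waits without hearing from the client
                // before it considers the circuit disconnected.
                options.ClientTimeoutInterval = TimeSpan.FromSeconds(60);

                // How often the server pings the client to keep the connection alive.
                options.KeepAliveInterval = TimeSpan.FromSeconds(15);
            });
    }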

    Happy Blazor-ing!

    ~ScottGeek

     

  • Blazor,  Microsoft

    Blazor: And the Identity Scaffolding Mess

    Ok, back around to the annoying bin. Let’s talk about adding Identity to a Blazor WebAssembly app…

    Out of the box you start with a Blazor WebAssembly project using Identity and, of course, ASP.NET Core hosted (I’m using preview 2). You then decide that all of the hidden Razor pages really need to be customized (don’t even get me started on hidden Razor pages). Well, there is a way to do just that.

    You Add Scaffolded Item to the Server project and select Identity (nope, not going to show you that – there’s plenty of how-to on the internet). I’ll wait while you go figure out how to do that…… Wait, wait, wait, wait….

    All right, now you have those missing Identity pages up in Areas/Identity/Pages/Account – and you have an app that compiles and runs, right? Well…. NO you don’t. Because now your app is broken… And that’s where we start… Let’s fix it.

    The Fix –

      RegisterConfirmation.cshtml.cs  (if during your scaffold you selected the RegisterConfirmation page to include in Areas/Identity/Pages/Account)
        You will need to add the reference using Microsoft.AspNetCore.Identity; – this is missing from the code-behind (see the sketch after this list).

      The scaffold also creates another wwwroot folder in the .Server project – the app runs into conflicts with CSS etc…
        All you need to do here is delete the wwwroot folder from the .Server project. The Blazor app will come up normally.
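
    For the first fix it really is just one using directive at the top of the scaffolded code-behind. A rough sketch of what the top of that file looks like afterwards – the namespace and IdentityUser type follow the scaffold’s defaults, so yours may differ:

    // Areas/Identity/Pages/Account/RegisterConfirmation.cshtml.cs
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Identity;        // <-- the missing reference
    using Microsoft.AspNetCore.Mvc.RazorPages;

    namespace MyApp.Server.Areas.Identity.Pages.Account   // hypothetical project namespace
    {
        public class RegisterConfirmationModel : PageModel
        {
            // UserManager<T> lives in Microsoft.AspNetCore.Identity – hence the compile error without it.
            private readonly UserManager<IdentityUser> _userManager;

            public RegisterConfirmationModel(UserManager<IdentityUser> userManager)
            {
                _userManager = userManager;
            }

            // ...rest of the scaffolded page model unchanged...
        }
    }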

    Now, there may be other compile errors depending on the mix of .NET Core you have, etc. You will just need to work through those.

    Another annoying thing that happens when you mix Blazor and these older Razor Identity pages… is how the identity system works when you log out. Normally one likes to have the app navigate back to the home page when one logs out. This is doable, but not in the box with the template.

    This is an easy fix….

    In your client project, in the Pages folder, you have an Authentication.razor file:

    @page "/authentication/{action}"
    @using Microsoft.AspNetCore.Components.WebAssembly.Authentication
    <RemoteAuthenticatorView Action="@Action" />
    
    @code{
        [Parameter] public string Action { get; set; }
    }

    This is the default page… Let’s make a change…

    @page "/authentication/{action}"
    @using Microsoft.AspNetCore.Components.WebAssembly.Authentication
    @inject NavigationManager navMGR
    <RemoteAuthenticatorView Action="@Action">
        <LogOutSucceeded>
          @{navMGR.NavigateTo("/");}
        </LogOutSucceeded>
    </RemoteAuthenticatorView>
    
    @code{
        [Parameter] public string Action { get; set; }
    }

    We injected the NavigationManager because we want to navigate to the home page.
     One can see that in the RemoteAuthenticatorView we have some options… LogOutSucceeded is a component option that allows us to do something when the user has logged out with no errors.
     This is a good place to put a NavigateTo Blazor command… and as we see, we can force the app (out of the box) to go to the home page.

    Neat… 

    Now yes, the errors and problems are annoying… they have been for the scaffold for a while now… but hopefully as Blazor moves out of preview, these things will be fixed.

    ~ScottGeek   Happy Blazoring….


  • SQL Server

    Powershell Create of the SSISDB Catalog

    So when management gets this [insert expletive] idea to strip away my MSDN enterprise access, and I need to create an SSISDB catalog on my dev server – what the hell am I to do? Well, maybe not simple… I put on my PowerShell hack-through hat. Shall we?

    In your favorite SQL instance (we’ll just use localhost for my examples) start with making sure you have CLR ENABLED first:

    EXEC sp_configure 'clr enabled', 1;  
    RECONFIGURE;  
    GO

    Do this in a query window in SSMS of course.

    Now the PowerShell:

    # Load the IntegrationServices Assembly  
    [Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices")  
    
    # Store the IntegrationServices Assembly namespace to avoid typing it every time  
    $ISNamespace = "Microsoft.SqlServer.Management.IntegrationServices"  
    
    Write-Host "Connecting to server ..."  
    
    # Create a connection to the server  
    $sqlConnectionString = "Data Source=localhost;Initial Catalog=master;Integrated Security=SSPI;"  
    $sqlConnection = New-Object System.Data.SqlClient.SqlConnection $sqlConnectionString  
    
    # Create the Integration Services object  
    $integrationServices = New-Object $ISNamespace".IntegrationServices" $sqlConnection  
    
    # Provision a new SSIS Catalog  
    $catalog = New-Object $ISNamespace".Catalog" ($integrationServices, "SSISDB", "P@assword1")  
    $catalog.Create()

    For the last part, “Provision a new SSIS Catalog”, make sure you use a good password.

    And that’s it…. So why am I doing this with PowerShell? No “dis” on PowerShell, but it seems that the version I installed with SQL Server 2016 has this thing about connecting to Azure SSIS when creating an SSISDB Catalog, and since I no longer have access to MSDN (you knew MSDN was in this somewhere) I had to hack my way around getting a catalog created… Now yes, I could have done the Azure SQL SSISDB connect blah blah (I do have an enterprise Azure account), but you know what? This same management does not want to spend money on Azure… Yeah! An Azure SQL SSISDB has a cost.

    So there you go… Got to love this jumping around without the tools to get stuff done.

    ~ScottGeek


  • .Net,  .Net Core,  Microsoft,  Visual Studio 2019

    Dot Net Hell by any other name?

    Just when I think the Dot Net Hell days are finally behind us… I run across the reality that those who created this in the first place are… well… still doing it?

     

    Take this little gem:

    public static async IAsyncEnumerable<string> GetContent(string[] urls)
    {
    
    }

    Yes, we now have IAsyncEnumerable… or do we? In my little experiment, I have a .Net 4.8 console app with the C# 8.0 language stuff turned on. So, I wanted to mess about with the Async Enumerable…. But in my console app that IAsyncEnumerable goes all red line of crap on me.

    So what’s the deal? Well, it’s simple once you do the digging. .Net 4.8 does not implement .Net Standard 2.1… Yeah, you guessed it… that’s where the Async Enumerable lives!

    So yeah, I could use .Net Core 3.0, which works with C# 8.0 and .Net Standard 2.1. Or here’s a thought: why not just do a .Net Standard 2.1 console app? Well, the latest version of VS 2019 does not have a .Net Standard 2.1 template listed… Ummm, I’m sensing something missing there.
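
    For what it’s worth, the same gem compiles and runs fine once the project targets .Net Core 3.0 (netcoreapp3.0). A minimal sketch of that – the body of GetContent and the example URL are mine, added just to make it runnable:

    using System;
    using System.Collections.Generic;
    using System.Net.Http;
    using System.Threading.Tasks;

    class Program
    {
        // The gem from above, with a body filled in for the sake of the sketch.
        public static async IAsyncEnumerable<string> GetContent(string[] urls)
        {
            using var client = new HttpClient();
            foreach (var url in urls)
                yield return await client.GetStringAsync(url);   // stream each result back as it arrives
        }

        static async Task Main()
        {
            // await foreach is the C# 8.0 consumption side of IAsyncEnumerable<T>.
            await foreach (var body in GetContent(new[] { "https://example.com" }))
                Console.WriteLine($"Fetched {body.Length} characters");
        }
    }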

    Welcome to the “New” Hell! 

    ~ScottGeek….

  • .Net Core,  Microsoft

    Dotnet Core Templates – The Revisit

    So a while back I penned an article (Dotnet new Magic), but now with the latest VS 2019 and Dotnet Core 3.x release… it’s time to revisit.

    In case we forgot – the command is:

    dotnet new -i "--The Template Reference Name Here--"

    The list of templates can be found in the old place but also here – Dotnet New Templates

    Which ones do I add:

    dotnet new --install "Microsoft.AspNetCore.SpaTemplates"
    dotnet new --install "Microsoft.Azure.WebJobs.ProjectTemplates"
    dotnet new --install "Microsoft.Azure.WebJobs.ItemTemplates"
    dotnet new --install "NUnit3.DotNetNew.Template"
    dotnet new --install "Microsoft.AspNetCore.Blazor.Templates"   **Some of these already exist, but this one adds the webassembly type.
    dotnet new --install "Microsoft.Azure.IoT.Edge.Module"
    dotnet new --install "Microsoft.Azure.IoT.Edge.Function"
    dotnet new --install "RaspberryPi.Template"

     

  • .Net,  Microsoft,  Visual Studio 2019

    Program Challenge using CSV and NetMQ Part 1.

    I did a recent program challenge. The requirements were not too complicated.

    Basically I was given a simple csv formatted file (csv, of course, being a comma-delimited text file). The challenge was to take this csv and parse it into a POCO – POCO being Plain Old CLR Object. I’m not sure why we need an acronym for that. A POCO is just a data structure in the form of a Class Object. But I digress.

    Once the csv is parsed into a Class Object, I was then to make it available in a Request/Response using NetMQ. Now, on the surface this might seem like a lot to do. But not really. The key, as always, is to break the problem down into workable bits.

    We start with what we know – some tasks:

    A) Input is a csv and it must be read into a Class Object.
    B) Using the Request/Response pattern, the csv must be output using NetMQ (this part relies on getting A done correctly).
    C) The Request and Response should be in two separate applications. Task A will need to expose the Class Object as the Response; this is Application 1. Application 2 will be the Request that will output the Class Object.

    So let’s look at the first part of task A.

    Now, as much as I like writing yet another file parser – not really – I opted in this case to use one that is already available. It’s called CsvHelper, by Josh Close. It’s available on NuGet, of course. The nice thing about this one is it’s simple and it works with POCOs. You wire it up based on your class object.

    Let’s start with my POCO (this was given by looking at the csv that just happens to have the first row with column names):

    public class People
    {
        public string Id { get; set; }
        public string FirstName { get; set; }
        public string LastName { get; set; }
        public string City { get; set; }
        public string State { get; set; }
        public string Country { get; set; }
    }

    Now wiring this into the csv parser is a simple matter of having a text reader and creating a new csvreader with the helper:

    static List<People> giveMeData(TextReader textReader)
    {
        // CsvHelper maps each csv row onto a People instance using the header row.
        var csvReader = new CsvReader(textReader);
        var theData = csvReader.GetRecords<People>();
        return theData.ToList();   // GetRecords is lazy, so materialize the list here
    }

    Here I have a method that parses the csv and makes a list of People.
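
    A quick usage sketch – the file name here is just a placeholder for whatever the challenge csv was actually called:

    // Open the csv (name is a placeholder) and hand the reader to giveMeData.
    using (TextReader reader = new StreamReader("people.csv"))
    {
        var people = giveMeData(reader);
        Console.WriteLine($"Parsed {people.Count} people.");
    }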

    And that’s it for the first part of task A. The second part of task A has to do with exposing this List of People through NetMQ. 

    We will do this in the next part of this blog.

     

  • Microsoft,  Visual Studio 2019

    SSIS (SSDT) and Visual Studio 2019 Oh my

    On my way to moving past… yes! Visual Studio 2012 (VS 2010 Shell!) to VS 2019, I’ve discovered one really annoying thing. SSIS, or as we call it now – SSDT projects, really is mostly not ready for Visual Studio 2019?

    But there is a preview extension that seems to “mostly” work. Go to the Marketplace, or VS Manage Extensions, and look for SQL Server Integration Services Projects – not the Reporting Services or Analysis Services ones (those are there too), but the Integration Services one.

    You add this extension and restart VS…. and you are almost there. 😉 Make sure you actually read the “Known Issues” section on the Overview page for the extension. Make special note of item number 6! (as of version 3.1). And I will quote:

     

    Variable window and SSIS toolbox may not be displayed properly if .NET 4.8 is installed (Windows 10 1903 installs .NET 4.8 by default). To work around this: 1) open Tools->Options window; 2) navigate to Environment->General; 3) uncheck “Optimize rendering for screens with different pixel densities”; 4) restart VS. For more details of this issue, please see: https://developercommunity.visualstudio.com/content/problem/638322/vs-2019-regression-transparent-toolwindowpane-with.html

    Yeaper… all is fine until you start looking at or creating a package and discover the black hole that is the SSIS Toolbox.

    Oh My Goodness… so sad… Anyway, the toolbox does come back after you change the pixel blah blah and restart VS.

    ~ScottGeek