Real-Time Talk: Windows 10 IoT Core Background Tasks and ASP.NET Core Web Apps

Display useful information from your Windows 10 IoT Core application in an ASP.NET Core web app, essential for integrating IoT data into a solution.

A Windows 10 IoT Core background task talks with a web application using WebSockets.

Problems

As my path to this solution was troublesome, I am listing the main problems I faced here so my dear readers have a better idea of the dead ends along the way:
  • I was not able to make the ASP.NET Core web application run under a Windows 10 IoT background service.
  • I found no information about when or if it will be supported in the near future.
  • ASP.NET MVC and ASP.NET Core have different SignalR implementations.
  • I was not able to make a SignalR client for .NET Core work with SignalR hosted in a web application.
I was able to make things work by using a WebSocket directly. It is not as nice a solution as I had in mind, but it works until things get better.

Making the Background Task and Web Application Talk


I worked out a simple and not very polished solution to make a Windows 10 IoT Core background task communicate with an ASP.NET Core web application hosted on a Raspberry Pi. I implemented two types of communication on the same WebSocket endpoint. The following image illustrates what is supported.
Windows 10 IoT Core background task and ASP.NET Core web application
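A minimal sketch of how such a single WebSocket endpoint can be wired up on the ASP.NET Core side is shown below. The "/ws" path, the buffer size, the ReceiveLoop helper and the overall Startup shape are illustrative assumptions, not code from the original project.

// Startup.cs - a minimal sketch of the web-application side, not the original project code.
using System;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Builder;

public class Startup
{
    public void Configure(IApplicationBuilder app)
    {
        app.UseWebSockets();

        app.Use(async (context, next) =>
        {
            // A single endpoint carries both the periodic device reports
            // and the commands typed into the web console.
            if (context.Request.Path == "/ws" && context.WebSockets.IsWebSocketRequest)
            {
                WebSocket socket = await context.WebSockets.AcceptWebSocketAsync();
                await ReceiveLoop(socket);
            }
            else
            {
                await next();
            }
        });
    }

    private static async Task ReceiveLoop(WebSocket socket)
    {
        var buffer = new byte[4096];
        while (socket.State == WebSocketState.Open)
        {
            WebSocketReceiveResult result = await socket.ReceiveAsync(
                new ArraySegment<byte>(buffer), CancellationToken.None);

            if (result.MessageType == WebSocketMessageType.Close)
            {
                await socket.CloseAsync(WebSocketCloseStatus.NormalClosure,
                    "closed by client", CancellationToken.None);
                break;
            }

            string message = Encoding.UTF8.GetString(buffer, 0, result.Count);
            // Dispatch point (omitted here): reports from the background task would be
            // pushed to connected browsers, and console commands forwarded to the
            // background task over its socket.
        }
    }
}

The accepted socket stays open for the lifetime of the connection, so the periodic reports from the device and the replies to console commands travel over the same channel.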
To make my sample more informative, I also implemented some primitive functionality to demonstrate how data moves.


  1. The Windows 10 IoT Core background task reports random numbers to the web application every 10 seconds (a sketch of this side follows the list).
  2. The web application has a simple console to send commands to the Windows 10 IoT Core background task. Supported commands are:
    1. opsys: returns operating system information from the background task
    2. machine name: returns the machine name where the background task runs
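Below is a hedged sketch of the background-task side, using the UWP MessageWebSocket API. The 10-second interval and the opsys and machine name commands come from the list above; the endpoint URI, the use of AnalyticsInfo and NetworkInformation to answer the commands, and all member names are illustrative assumptions.

// StartupTask.cs - a sketch of the background-task side, not the original project code.
using System;
using System.Linq;
using System.Threading.Tasks;
using Windows.ApplicationModel.Background;
using Windows.Networking;
using Windows.Networking.Connectivity;
using Windows.Networking.Sockets;
using Windows.Storage.Streams;
using Windows.System.Profile;
using Windows.System.Threading;

public sealed class StartupTask : IBackgroundTask
{
    private BackgroundTaskDeferral _deferral;
    private MessageWebSocket _socket;
    private DataWriter _writer;
    private ThreadPoolTimer _timer;
    private readonly Random _random = new Random();

    public async void Run(IBackgroundTaskInstance taskInstance)
    {
        // Keep the headless task alive so the socket and timer keep running.
        _deferral = taskInstance.GetDeferral();

        _socket = new MessageWebSocket();
        _socket.Control.MessageType = SocketMessageType.Utf8;
        _socket.MessageReceived += OnMessageReceived;

        // The address of the web application on the Raspberry Pi is an assumption.
        await _socket.ConnectAsync(new Uri("ws://localhost:5000/ws"));
        _writer = new DataWriter(_socket.OutputStream);

        // Report a random number to the web application every 10 seconds.
        _timer = ThreadPoolTimer.CreatePeriodicTimer(
            async _ => await SendAsync($"random: {_random.Next(100)}"),
            TimeSpan.FromSeconds(10));
    }

    private async void OnMessageReceived(MessageWebSocket sender,
        MessageWebSocketMessageReceivedEventArgs args)
    {
        string command;
        using (DataReader reader = args.GetDataReader())
        {
            reader.UnicodeEncoding = UnicodeEncoding.Utf8;
            command = reader.ReadString(reader.UnconsumedBufferLength).Trim();
        }

        // Answer the two commands supported by the web console.
        if (command == "opsys")
        {
            AnalyticsVersionInfo info = AnalyticsInfo.VersionInfo;
            await SendAsync($"{info.DeviceFamily} {info.DeviceFamilyVersion}");
        }
        else if (command == "machine name")
        {
            HostName host = NetworkInformation.GetHostNames()
                .FirstOrDefault(h => h.Type == HostNameType.DomainName);
            await SendAsync(host?.DisplayName ?? "unknown");
        }
    }

    private async Task SendAsync(string message)
    {
        // Note: for brevity this sketch does not synchronise concurrent writes.
        _writer.WriteString(message);
        await _writer.StoreAsync();
    }
}

Taking the deferral in Run is what keeps a headless Windows 10 IoT Core background task alive after Run returns, which allows the socket connection and the periodic timer to keep working.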
Here is a fragment of the web application's front page while the background task is running.
WebSocket sample output with command
