
Posts tagged ‘LabVIEW’

Let’s Talk About LabVIEW Units, Part 2

In Part 1, I talked about important things to know about LabVIEW Units. In this part, I will focus more on a few use cases where I find them valuable.

Depth of Field with LabVIEW Units

When I was first learning how to use LabVIEW Units, I wrote a photography “depth of field calculator” with them. As many of my readers know, I’m an avid photographer. Check out bhpowell.com! In photography, the “depth of field” is a measure of how much of an image is in focus.

There are four variables that go into calculating depth of field.

  1. Lens focal length — this is almost always represented in millimeters, such as a 50mm lens.
  2. Aperture — a unitless measure of how big the lens opening is. It’s like our own eye’s iris, which closes down in bright light, and opens up in low light. Because of the way light travels through the lens, a narrower opening gives greater depth of field. Aperture is also called the “F Stop”, and the bigger the number, the smaller the opening. As we’ll see, it’s also a square relationship (based on area), so an f/4 opening lets in twice as much light as f/5.6, and four times as much as f/8.
  3. Focus distance — the further away your subject is, the more distance front to back is in focus. The depth of field is relative to this focus point, so with an extreme closeup, you may only have a few millimeters front to back in focus. Distance is measured in whatever is natural for the country we each live in. Feet in the USA, meters almost everywhere else.
  4. Circle of confusion — this interesting sounding term is a measure of how precise you want to be when you say something is “in focus”. Exact focus is an infinitely tiny point, so the “circle of confusion” is a measure of how much area of the sensor is considered to be the same point. It’s usually measured in fractions of a millimeter, and is usually related to the sensor or film size of the camera.
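To make the relationships concrete, here are the standard thin-lens formulas, sketched in Python with the pint units package. The 0.03 mm circle of confusion is my assumption for a full-frame sensor, and the example that ships with LabVIEW may use slightly different constants, so treat this as a rough sketch rather than the example’s exact math.

import pint

ureg = pint.UnitRegistry()

# Inputs (the circle of confusion value is an assumed full-frame figure)
f = 50 * ureg.millimeter       # focal length
N = 2.8                        # aperture (unitless f-number)
s = 6 * ureg.meter             # focus distance
c = 0.03 * ureg.millimeter     # circle of confusion

# Standard thin-lens approximations
H = f**2 / (N * c) + f         # hyperfocal distance
near = s * H / (H + (s - f))   # near limit of acceptable focus
far = s * H / (H - (s - f))    # far limit of acceptable focus

print(H.to("m"), near.to("m"), far.to("m"))
# With these assumptions: roughly 29.8 m, 5.0 m, and 7.5 m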

There’s an example that ships with LabVIEW that demonstrates this simple calculation. It produces three results:

Depth of Field front panel
Block diagram of the LabVIEW example

The near and far distances are the front and back limits considered in focus, so everything from 5 to 7.5 meters is in focus when I have a 50mm lens at f/2.8 focused at 6 meters. The example also lets me replace “m” with “ft” and the code stays the same:

Same example with “feet”

The hyperfocal distance is an intermediate computation, and basically answers the question, “At what distance do I need to focus if I want the far horizon to be in focus?” If I run the calculation with the focus set to 29.8 meters, for example, the far distance is computed as almost 75 kilometers away, which is effectively infinite. Oh, as an aside, if you use the “SI” numeric display format, it plays nicely with units.

In the example above, instead of showing “74.50k”, it moved the “k” to the unit and automatically became “74.50 km”.

This example lends itself to LabVIEW Units because the computation is a mix of lengths represented with different units–millimeters and meters or millimeters and feet. Having LabVIEW Units as part of the data type provides some guardrails for my programming. If I try to add or subtract aperture from length, the wires will break, letting me know I’ve done something wrong. If, after all my multiplies and divides, I don’t end up with length as a result, I know I’ve left out part of the computation.


Representing duration with LabVIEW Units

Here’s another simple example of where units are useful.

I have one application that acquires data at about 1 Hz, with a front panel graph that can display the last hour, last day, or last week of data. Originally, I encoded each duration as a constant containing a number of seconds:

  • 3600 (one hour)
  • 86,400 (one day)
  • 604,800 (one week)

I added a comment next to each to explain what the values mean.

Then it occurred to me that I could put a unit on that value, and instead use:

  • 1h
  • 1d
  • 7d

This is more descriptive, but also more likely to be correct. (If I mistyped “604800” as “608400”, would I immediately recognize that I did it wrong?).
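The same readability argument carries over to text languages, by the way. Here’s a quick Python sketch (just an analogy, not the application code):

from datetime import timedelta

HOUR = timedelta(hours=1)
DAY = timedelta(days=1)
WEEK = timedelta(weeks=1)

# The magic numbers fall out of the type instead of being typed by hand
print(HOUR.total_seconds(), DAY.total_seconds(), WEEK.total_seconds())
# 3600.0 86400.0 604800.0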


Converting complex formulas with LabVIEW Units

My third example is a customer application that computes the time at which a vehicle will reach a particular travel point, based on the vehicle’s velocity and the distance between sensors monitoring its progress. As an added bonus, the system was designed with the metric system in mind (km/h for speed), but the distance displays are both metric (meters) and imperial (feet and inches).

Originally, the operator had to use a spreadsheet with undocumented formulas of velocity and distance, and then paste the results into a LabVIEW front panel. The LabVIEW application then used those numbers to program counter/timer hardware.

I replaced the spreadsheet with a LabVIEW VI and used Units. Again, using Units provided guardrails as I translated the algorithm into LabVIEW. As an example, here’s one of the Excel formulas I needed to make sense of:

=((A$25-K$12)*40-(279-H$30))/10+K$15

Based on other documentation, I began to deduce that these specific Excel cell values represented time. From there, I conjectured that “279” was also a time (e.g., in milliseconds), and that “40” and “10” were frequencies in kHz. (Or was the 10 in micro- or nanoseconds? Or in ticks of an 80 MHz clock?)

By assigning units to my LabVIEW translation, I’d get a broken wire whenever I didn’t have a unit right. Using units gave me confidence that I understood my code and could document it when I was done. I even went so far as to replace a formula node (a built-in feature that doesn’t work with Units) with add, subtract, and divide primitives (which do work with Units). I kept the formula as a comment, since its algebraic nature is easier to understand at a glance.
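To show that guardrail idea in text form, here’s a Python sketch using the pint units package. The cell values are made up and the unit assignments are my conjectures, not the customer’s actual numbers; the point is only that an inconsistent conjecture fails loudly instead of silently producing a wrong answer.

import pint

ureg = pint.UnitRegistry()

# Hypothetical stand-ins for the spreadsheet cells, with conjectured units
A25 = 12.0 * ureg.millisecond
K12 = 3.0 * ureg.millisecond
H30 = 200.0 * ureg.millisecond

# Conjecture: "40" is a 40 kHz clock, so time * clock is a unitless tick count
ticks = (A25 - K12) * (40 * ureg.kilohertz)

# If "279" were a time in milliseconds, this would subtract a time from a
# unitless count; pint raises DimensionalityError, the text-language
# equivalent of LabVIEW's broken wire telling me the conjecture doesn't hold.
try:
    value = ticks - (279 * ureg.millisecond - H30)
except pint.DimensionalityError as err:
    print("unit mismatch:", err)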


Key takeaways

When you next write a LabVIEW application that represents a physical quantity, I recommend that you give units a try. Fair warning: there’s a bit of a slippery slope to using units. If you add units to one control, there’s a good chance that you’ll want to follow that wire and add units to subVIs and other controls and indicators, and you’ll soon find yourself committed to using units throughout the application. It’s a good feeling, though, when you’re done.

The other piece of advice, which I mentioned in the previous post, is to not force units into reuse libraries. Sure, use units in a reusable VI if it always works with length or another physical quantity, but sometimes it’s easier to not have units (even polymorphic ones) and let callers use Convert Unit outside of your reuse library. In other words, it’s okay to define boundaries between components where you will add or remove units.

My point in writing these two blog posts is to encourage you to use Units where they work well, and discourage you from using them where they don’t. I could go on about ways I wish they worked better, but I still think they’re a great feature worthy of more attention.

Let’s Talk About LabVIEW Units, Part 1

First, let’s start with a poll…

What do you think of LabVIEW Units?


I’m a fan of Units in LabVIEW, but I totally understand that I’m in the minority. I’m writing this blog post to perhaps persuade at least a few of you to give them a try.

What are they?

A “Unit” in LabVIEW is a label you can add to a floating-point number to tell LabVIEW what physical quantity the number represents.

Show the unit label

For example, I can assign the unit “degC” to a front panel control, to denote that the value is Degrees Celsius. Similarly, “degF” is Degrees Fahrenheit. If I wire a control representing degC to an indicator representing degF, when the VI runs, it automatically converts the value for me. I don’t have to remember F = C * 9 / 5 + 32.

Temperature conversion
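If you’d like a text-language analogue to play with, Python’s pint package behaves much the same way. This is just a sketch for comparison; the LabVIEW wiring above is the real subject.

import pint

ureg = pint.UnitRegistry()

reading = ureg.Quantity(25, "degC")   # a value tagged as degrees Celsius
print(reading.to("degF"))             # 77.0 degree_Fahrenheit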

LabVIEW knows about lots of physical units, using the International System of Units (SI). It also knows about prefixes like “milli”, “centi”, “kilo”, etc. If you’d like to explore them, you can right-click on a unit label and select “Build Unit String”.

Help building unit strings

This brings up a dialog to help you create a valid unit string.

Build Unit String Dialog

Note that you can do arithmetic on units. For example, I can represent velocity as “m/s” (meters per second), or “mi/h” (miles per hour), or “cm/ms” (centimeters per millisecond).

Convert meters/second to miles/hour.

Similarly, I could have exponents on units, such as m/s^2 to represent acceleration in meters per second squared.

In the example above, I hardcoded the unit as m/s, but I could also have taken a control of unit “meters” and divided it by a control of unit “seconds”, and produced a result that was velocity:

Unit arithmetic

Many of the built-in LabVIEW functions understand units. The unit is part of the LabVIEW data type, which also means that LabVIEW will make sure that any arithmetic you do on units is valid. For example, I can’t add meters and seconds, because they are different base SI units (length vs. time).
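In the pint analogy again (still just a sketch, not LabVIEW), the same two behaviors show up: dividing compatible quantities yields a derived unit, and combining incompatible ones fails loudly, which is the text-language version of a broken wire.

import pint

ureg = pint.UnitRegistry()

distance = 100 * ureg.meter
elapsed = 9.58 * ureg.second

speed = distance / elapsed            # derived unit: length / time
print(speed.to("mile/hour"))          # roughly 23.35 miles per hour

try:
    nonsense = distance + elapsed     # length + time is not defined
except pint.DimensionalityError as err:
    print("can't add meters and seconds:", err)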

Note that you can change the units on the front panel while a VI is running, as long as you enter a compatible unit. For example, in the VI above, I could change “m/s” to any velocity unit (such as “km/d” for kilometers per day) at runtime. Because the unit is part of the data type, you can’t change to an incompatible unit (e.g., change velocity to length) unless the VI is editable.

We’ve seen one rough edge to units so far: almost nobody writes “miles per hour” as “mi/h”. It’s more accepted in the USA to write “mph”. Or what if I wanted to be more verbose, and say “miles/hour”? Well, too bad. The LabVIEW Unit system doesn’t support that. There’s a long list of feature requests for making LabVIEW Units better and more customizable.

By the way, most of the decision makers about NXG saw no value in LabVIEW Units (because they didn’t use LabVIEW for much real-world work), so I think the plan was that NXG would never support Units. That made me sad. But fortunately, that resolved itself when NXG was retired.

How do Units work?

Under the hood, the system uses the base SI units. There are seven base SI units:

Read more

MQTT in LabVIEW

I’ve been experimenting lately with MQTT in LabVIEW. MQTT is an open standard for message-based communications using a publisher/subscriber paradigm. It’s commonly used in the Internet of Things (IoT) world.

I’m working on updating an old LabVIEW application that makes heavy use of DataSocket, an NI-proprietary data communications mechanism. DataSocket has been deprecated, so my customer was interested in updating the communications scheme.

My friend Jörg Hampel of Hampel Software Engineering recommended I look into François Normandin’s LabVIEW Open Source MQTT client and broker. I started by watching François’ YouTube video, which gives a good introduction:

In parallel, I downloaded and installed the software with VIPM so I could follow along with the video. (Just search for MQTT in VIPM.)

Read more

OAuth2 and LabVIEW — Replacing the Web Server

Welcome to what will probably be the last update for this series of blog posts. In the very first article I wrote about this topic, I mentioned that I didn’t like using global variables to pass information from the web callback function to the main part of the application.

I’m finally going to address that design concern by completely changing how I handle the web callback. Credit for this approach goes to a couple of people: Jörg Hampel of Hampel Software Engineering (and the DSH Workshops) who first showed me this idea over Skype, and also Robert Smith, who wrote this excellent article about writing your own LabVIEW web server.

For this situation, I don’t need a full-blown web server–just something simple enough to receive the OAuth2 callback and parse the URI. So instead of depending on any of the three or four LabVIEW/NI web servers, I’m just going to use the TCP functions built into LabVIEW. I recommend you read the article linked above, but I’ll explain my approach here. I’ve updated the LabVIEW 2019 code in the Gitlab repo if you want to download it.
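To make the idea concrete, here’s roughly what that one-shot listener looks like in Python with raw sockets. The real implementation in the repo uses LabVIEW’s TCP functions, and the port and path below are placeholders rather than the values from my code.

import socket
from urllib.parse import urlparse, parse_qs

# Listen for a single connection on the redirect URI's port (placeholder port)
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 8080))
listener.listen(1)

conn, _ = listener.accept()                       # the browser hits the redirect
request = conn.recv(4096).decode("ascii", errors="replace")

# The request line looks like: GET /callback?code=...&state=... HTTP/1.1
path = request.split(" ", 2)[1]
params = parse_qs(urlparse(path).query)
auth_code = params.get("code", [""])[0]

# Send a tiny response so the browser isn't left hanging, then clean up
conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Type: text/plain\r\n\r\n"
             b"You can close this window now.")
conn.close()
listener.close()
print("authorization code:", auth_code)

The LabVIEW version follows the same steps with the built-in TCP functions, which is why a full web server isn’t needed here.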

Read more

OAuth2 and LabVIEW — Revisited for 2020, Changes to the LabVIEW Web Servers

Enough about LabVIEW bugs (part 2, part 3)–let’s talk about LabVIEW 2020 features!

One of the significant (and essentially undocumented) changes in LabVIEW 2020 was switching the default web server from the old LabVIEW Web Server to the new NI Web Server. Unlike the LabVIEW Web Server, which runs as part of a LabVIEW instance, the NI Web Server is a separate application dedicated solely to being a web server.

For the most part, you can use it just as before. There’s still a “Start” menu item, which you find by right-clicking the web service in the LabVIEW Project.

Start menu for starting the web service.

If you select this, the only change you’ll likely notice is that it’s running on a different port, shown below.

Read more

OAuth2 and LabVIEW — A Bug in LabVIEW’s SSL Certificate Handling

After discussing a LabVIEW 2020 bug in my last post in early June, I let the wind go out of my sails and set the new version aside for a while. Around the same time, one of my other OAuth2 applications stopped working. It wasn’t urgent and I didn’t investigate right away, but it turns out that I found another LabVIEW bug lurking in the way SSL certificates are handled.

Now before anyone panics, almost no one will run into this bug. It does impact all recent versions of LabVIEW–2020 and 2019 for sure, but to be honest, I didn’t go back to earlier versions. I filed a bug report with NI, but it’s not clear if they’ll prioritize fixing it. Details follow…

Read more

OAuth2 and LabVIEW — Revisited for 2020, A Bug in the SHA256 VI

In part one of this 2020 update, I began a journey of updating my OAuth2 example to use a new feature in LabVIEW 2020–the new hash function that supports SHA-256, among other algorithms.

Where I left off, I needed to modify the output of the new VI to create a byte array instead of converting it to a lowercase hex string.

I proposed three choices:

  1. Ignore the 2020 VI and just use the .Net implementation I used in 2019.
  2. In my SHA256.vi, add code after Byte Array Checksum.vi to convert the hex string back into a binary array.
  3. Make my own copy of Byte Array Checksum.vi and remove the subVI which converts to a lowercase string.

Which one would you choose? I decided to try all three. I already had #1, since it was the 2019 version. Here’s a quick and dirty implementation of #2, where I convert the hex string back to a byte array.

After calling the 2020 VI, convert it back to a byte array

And here’s an implementation of #3, where I went and found the VI that called Bytes to Lowercase Hex String, made a copy of it, and removed the subVI call. I replaced it with a straight Byte Array to String.

Modify the vi.lib VI to replace the ASCII conversion with a Byte Array to String
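If it helps to see the distinction in text form, the whole question boils down to digest-as-bytes versus digest-as-lowercase-hex-string. Here’s a Python sketch for illustration only (the verifier value is a placeholder, not anything from my code):

import hashlib

verifier = b"example-code-verifier"                   # placeholder input

digest_bytes = hashlib.sha256(verifier).digest()      # raw byte array (what I need)
digest_hex = hashlib.sha256(verifier).hexdigest()     # lowercase hex string

# Option #2, in text form: convert the hex string back to bytes
assert bytes.fromhex(digest_hex) == digest_bytes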

What do you think so far? There are things I dislike about both #2 and #3.

  • In #2, it seems wasteful to convert it to ASCII, and then convert it back. These aren’t large strings, but it just seems like a hack.
  • In #3, I dislike the idea of modifying a vi.lib VI–especially one that’s not on the palettes.

I’m leaning towards #3, because it feels like the right implementation, even if it violates the “don’t mess with vi.lib VIs” principle.

Before I commit to a solution, let’s run the unit tests on each. The results for #1 pass with flying colors, of course. The results for both #2 and #3, though, fail. And I thought this was going to be easy. Keep reading below…

Read more

OAuth2 and LabVIEW — Revisited for 2020, Using the New SHA256 VI

After LabVIEW 2020 released, I thought I should revisit my OAuth2 example to see how I could apply new features to improve the code. I thought it would be simple and straightforward and magically better, and fit into a single blog post. But, I think it’s going to be more of a journey than that.

I decided to start with something simple: LabVIEW 2020’s implementation of the SHA-256 secure hash that’s needed for the code verifier. This ought to be able to replace the SHA256.vi in my 2019 example, which was based on a .Net call. That VI was one of the few Windows-specific pieces of my code.

My first step was to create a new repo for my 2020 code. I think I want to keep my 2019 version around, so I’m hesitant to create a branch to merge back into it. I may regret it, but it felt like a new repo was the way to go.

Next, after loading the project into LabVIEW 2020, I went to SHA256.vi and selected Find -> Callers. It’s only called from two places, one of which is a unit test helper VI. (Aside: The unit test was only listed because it used a helper VI. If you think that Find -> Callers should have also reported which .lvtest files call the VI, kudo this idea.)

Good, I thought–I have unit tests and can ensure that the new VI passes all my old tests. Spoiler alert: it wasn’t that easy. Keep reading below.

Read more

Best Practices for Using LabVIEW with LXI

I’ve been designing a test system for a customer, and it’s going to be a mix of PXI and LXI instrumentation. Most NI customers understand PXI, but aren’t very familiar with LXI, which is the standard for Ethernet-based instrumentation. Up until about five years ago, I was a member of the LXI Consortium, and closely followed all of its technical developments. For my new project, I’ve been searching the internet so I can catch up and learn about the latest with LXI. While doing so, I realized there’s not a lot out there that discusses how to get started with LXI and LabVIEW (and NI MAX and NI VISA).

Around the same time, a customer asked me to teach the National Instruments “LabVIEW Instrument Control” course, which hasn’t been updated in about ten years. It also doesn’t cover much in the way of Ethernet or USB instrumentation.

So, I felt like I should do something about this dearth of content. I thought a video or two might be the best way to explain how to get started. I asked my friends at the LXI Consortium if anyone could loan me an LXI device, and Keithley Instruments graciously sent me a very nice DMM.

With that DMM on my bench, I created a couple of short videos (attached below). Part one is a good starting point if you’ve never used LXI with LabVIEW before. Part two begins to cover some of the issues if you want to use LXI in a production environment where the system needs to be more robust and maintainable.

For those wanting to learn even more, I’m developing additional custom training to help people who are building LabVIEW-based test systems that include more modern buses like USB and Ethernet. Contact us if you are interested in this custom training.

I’d love to hear your feedback on these videos. Please comment below!

OAuth2 and LabVIEW — Part Four, Reusability

It’s only when you try to reuse your code that you really understand how reusable it is.

– Mr. Obvious

After finishing part three of this series of posts on OAuth2, I went back to my original goal of writing code to interact with my Wireless Sensor Tags for measuring temperature and humidity. The further I went down that path, the more I wanted to change the existing example code.

Since this is a blog post that is partly about how the example evolved, I wanted to write a part four to describe my thought process. The code has been updated in the repository: https://gitlab.com/bhpowell/oauth2-labview-tutorial


Overall file organization

My first step in adding support for the Wireless Tag web service was to duplicate Main.vi. I called the new copy “Wireless Tag.vi”, and began changing out the endpoints, IDs, secrets, and such. To reduce confusion about having two top-level VIs, I renamed Main.vi to “Example Get Google Photo.vi”. Not necessarily a great name, but more descriptive than “Main”.

As I began changing out the “google.com” and “googleapis.com” endpoints for “my.wirelesstag.net” endpoints, I realized I’d saved several of the subVIs with default values for those endpoints. So, even though I found all of the URLs on the top-level diagram that needed to be updated, when I ran the VI, it was still accessing Google APIs because of the default values.
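The text-language analogue of that gotcha is a default parameter value. Here’s a Python sketch, with placeholder URLs, just to illustrate the trap:

# A subVI saved with a default endpoint behaves like a default argument:
def exchange_code_for_token(code, endpoint="https://oauth2.example-google.test/token"):
    # if the caller forgets to pass `endpoint`, the old default quietly wins
    return endpoint, code

# Updating the top-level caller but not the default still hits the old service:
print(exchange_code_for_token("abc123"))
# -> ('https://oauth2.example-google.test/token', 'abc123')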

Here’s an example for the VI that exchanges the code for the token. First, the original way I used the subVI:

Read more