[youtube http://www.youtube.com/watch?v=9cKBdn4uHyY]
Props go to [Michael Nash] for establishing an interface between National Instruments' LabVIEW and an Arduino (an example video using a potentiometer is above). Personally, from the one time we were forced to use LabVIEW, we hated every second of it.
One reason it's so terrible is that the data acquisition modules cost well into the hundreds of dollars, yet the documentation and help resources are very scarce. By using an Arduino instead of the modules, the price and difficulty drop considerably. Which begs the question: why has it taken so long to get such a decent (and simple) setup working?
I hated every second of it too when I was trying to learn it for robotics… I just went with C++.
Yeah, I’ve got to second the aversion to LabView. While useful for people with no programming experience at all, creating anything complex is a terminal pain. Drawing wires? Color-coded wires for myriad datatypes. Sets of cinematic-looking frames for flow-control loops…
Too bad he got this working; now more people will be tempted to use LabView.
wow, never thought i’d see a labview post on hackaday…
just wanted to put in my 2 cents as a certified labview dev – it is NOT a hobbyist tool. Hell, it’s not even an amateur tool. labview is something that firms use for rapid deployment of test programs or data acq. environments, without having to know much about programming or hardware. There are clients that don’t know why they need labview, they just know that Engineering Firm XYZ uses it and is successful.
I’m not saying I like it or enjoy using it, but certified labview devs are in HIGH demand, and do pretty well from a salary standpoint.
And yes, the modules and software are TERRIBLY expensive, but remember it is targeted at the professional sector. Hell, they have a USB to 4-port RS232 switch that is $500!
Anyway, cool project!
You clearly have no idea of the target market/application of NI’s hardware & software. Name me an off-the-shelf system that is as rugged, powerful, and easy to set up as, for example, the compactRIO system…
I used labview at work for years. It was a must have for the advanced product development team that I worked with, and though it might have a steep learning curve, you can do damn near anything that you can imagine with it.
+1 for posting a labview-related article!
I use both LabVIEW and arduinos often for very different things. I use LabVIEW at work and both at home.
It’s good for a lot of things and not so good at some things. It’s definitely not priced for the hobbyist market, that’s for sure.
I found it trivially easy to have LabVIEW communicate with an Arduino using just a simple serial protocol (the sketch just after this comment shows the rough idea).
I will call you out on the statement that using an Arduino for anything is easier than the IO modules that LabVIEW will talk to natively. The IO modules get better resolution, much better speed, and it takes a LabVIEW newbie just a couple of minutes to start acquiring data.
LabVIEW markets to a completely different class than the Arduino. The class of people who get paid to get things done fast and not tinker.
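For anyone wondering what such a simple serial protocol can look like on the Arduino side, here is a minimal sketch of the idea. It assumes a request/response scheme where the PC sends a single 'R' character and the board replies with one potentiometer reading per line; the pin, baud rate, and command byte are illustrative choices, not code from Michael Nash's project.

// Hypothetical Arduino half of a LabVIEW <-> Arduino serial protocol.
// The PC (LabVIEW via the VISA serial VIs, or any terminal program)
// sends 'R'; the board answers with one newline-terminated reading.
const int potPin = A0;

void setup() {
  Serial.begin(9600);                 // must match the rate set in the VI
}

void loop() {
  if (Serial.available() > 0) {
    if (Serial.read() == 'R') {
      int value = analogRead(potPin); // 0..1023 from the 10-bit ADC
      Serial.println(value);          // newline makes parsing trivial
    }
  }
}

On the LabVIEW end this maps onto a VISA Write of the command, a VISA Read up to the line terminator, and a string-to-number conversion.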
Thanks Michael.
Hi,
I use LabView almost every day at work, and it’s true, at first it’s a pain to adjust to, and I hated it for a long time. Now I think that you just have to know what LabView is intended for and what it isn’t.
It’s great for data acquisition and instrument control, and it’s also good for process control, but I wouldn’t use it to control a robot, program a uC, or for computer vision, even if there is a tool for it in LabView.
Anyway, it’s clear that LabView isn’t for the hobbyist; you just have to look at the price…
“Personally from the one time we were forced to use labVIEW, we hated every second of it.”
SECONDED. I could not imagine a task that Labview would be better suited for than a simple script written in a general purpose language, provided there was a simple interface to the hardware.
I also use labview every day. Too bad you give it a bad name. I like it, just like I also like playing with arduinos.
I agree with afex. I also use LabVIEW professionally and it has incredible capabilities – that being said, if you are attempting to build something very simple you could quickly become overwhelmed or frustrated.
Now having the ability to code and build a spectrum analyzer in about 3 minutes, or an OFDM modulator in an hour… good luck doing that in C.
Seamless integration of a VSA, VSG, Virtex5 FPGA array and RAID array with full support from a single IDE? Incredible.
Of course this is running on a piece of hardware that costs $120k…. but that’s why it’s well suited for professional use.
I too am a certified LabVIEW developer and use it constantly at work. If you can’t think of anything that would be better suited to LabVIEW than a general purpose language, you’re not looking at the right sectors. Labview’s not for the hobbyist, at all. While Labview does have its faults, it’s sure got its strong points, and the hardware from NI is, for the most part, some of the best stuff you can buy. I would have to say some of the better features of Labview aren’t really available on other platforms – such as easy GUI creation (like “falling-off-a-log” easy) and the massively parallel structure that it uses. Sure, I wouldn’t program a cell phone with it, but we have a program we developed here at work that will completely test our piezo actuators, including frequency response. Do that with a microcontroller in less than an hour ;)
@ byr_d
…Well then you must not have been using it for much! Sounds like you are more of a small time hobbyist as opposed to an engineer. If you worked as an electrical engineer, in any EE-related field, you’d know how wrong you are!!
it doesn’t “beg the question”, it “raises the question”
educate yourselves!
begthequestion dot info
I love labview, although it does take a long time to learn. I like it better than C++.
Labview seems to be the quick and easy way when you don’t do the visualisation (and/or control) yourself. The pricing, and the fact that I’d make myself dependent on one company, are a no-go for me.
Did anyone use Scilab? http://www.scilab.org/
Looks like an open-source replacement for LabView.
As someone who also uses Labview as part of my day job, I have to agree with the other guys here. It can be a great language depending on what you’re doing with it. One of the things that, I think, gives it a bad name is the fact that developing in it is a radically different experience than most standard languages like C, C++, Java, etc. Everything from how you structure your logic to documentation has to be done differently in order to be successful with larger projects. Basically, in order to successfully program in it, you have to think a little differently than with conventional languages.
As for documentation and help resources, I have a feeling that you just didn’t know where to look. For most Labview-compatible products (which is a large percentage of the data acquisition and automation industry) there is a lot of high-quality documentation. Of course, it does vary some based on who the manufacturer is. As for help, NI runs a pretty active developer community, through their website, where you can go for assistance.
maybe i’m remembering wrong, but labview software is pretty freaking expensive too. I took a class in labview. I’ll admit it’s impressive in what it can do (and the easy GUIs are nice), but i didn’t enjoy it at all. probly would’ve helped if our computers had something better than pentium 4s tho.
anyway, i find it ironic in this context that people usually complain about using an arduino to do a simple job, because using something like labview to interface with an arduino is the very essence of overkill.
honestly, unless i were designing a very intricate instrument interface, i’d prefer any uC writing voltages to a text file over labview. maybe i’m just lazy, but i’d say just recording data isn’t enough to justify using labview. recording and interpreting data live while controlling the instrument…that’s a job for labview. or just planning ahead.
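For what it’s worth, the “uC writing voltages to a text file” approach really is only a handful of lines of firmware. A rough sketch is below: it streams timestamped, comma-separated readings that can be captured straight to a text file with a terminal program’s logging function (or something like cat /dev/ttyUSB0 > log.txt). The pin, sample rate, reference voltage, and CSV layout are assumptions for illustration.

// Free-running CSV logger: one line per sample, suitable for dumping
// into a text file and opening later in a spreadsheet or plotting tool.
const int sensorPin = A0;               // whatever voltage you want to log
const unsigned long intervalMs = 100;   // 10 samples per second

void setup() {
  Serial.begin(115200);
  Serial.println("millis,adc_counts,volts");   // CSV header row
}

void loop() {
  int counts = analogRead(sensorPin);
  float volts = counts * (5.0 / 1023.0);        // assumes a 5 V ADC reference
  Serial.print(millis());
  Serial.print(',');
  Serial.print(counts);
  Serial.print(',');
  Serial.println(volts, 3);                     // three decimal places
  delay(intervalMs);
}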
Hello!
That’s why I prefer Agilent VEE. The program and its methods are easier and it even respects the way we work. It currently runs only on Windows, but that’s not a fault.
LV, on the other hand – well, I don’t like it because it is too complex and the help screens are not helpful. Also, the evaluation period is fixed, which seems to assume that most companies even today have deep pockets.
“Drawing wires?” Yeah, that sure is terrible having to define dataflow with wired connections in some sort of graphical environment. That’s why I do all of my schematics work in C++.
If cost is a main complaint, you might gain some perspective by comparing the price of a walmart multimeter to a Fluke DMM with NIST traceable calibration certificate.
@torpid:
“That’s why I do all of my schematics work in C++.”
I know you’re being sarcastic, but VHDL/Verilog exist because it isn’t practical to hand-draw schematics for massive integrated circuits.
I think people are getting a bit confused about what Labview is well suited to. It is well suited to making nice user interfaces to lab equipment. It can be very handy for technicians and engineers working on tying together several instruments while performing tests. It is not well suited to highly complex programs. It is not a good choice when you need very high reliability.
It was originally intended for lab workers who were not programmers by trade; however, there are now a large number of professional Labview programmers who are quick to defend the language.
Difficult to interface with your own modules? I think not; I managed it on my first serious day of LabVIEW work.
At the moment I am working on a project involving a compactRIO system, but as I had no experience with LabVIEW, I set myself the task of gathering analog data from a PIC on the USB port and displaying it in LabVIEW. It was a task which, with no previous experience, took me just a few hours.
@r_d
My only experience with VHDL is writing code for FPGAs or CPLDs, and it seems to me that writing code for FPGAs requires a little different mindset than writing “typical” user-interfaces, data acquisition and processing, machine control, and data-logging in a PC environment. That being said, there is LabView FPGA which I think is tailored to programming the FPGA in the cRIO backplane. I have found it quite straight-forward and powerful.
As a counterpoint, there are times when I prefer to write code instead of wiring it. Writing equations with many variables can get cumbersome for example. So, for those situations I use a mathscript or formula node and write it all out with m-code format. I can easily access .NET libraries, use activeX, run arbitrary commands in the commandline, or whatever. Oh, one other downside is that it is really easy to write HORRID obfuscated code. You can make sections of your code literally transparent. That, I would say, is a downside.
We’re hackers. If you’re whining about being held back by a mature programming environment, then I think you’re doing something wrong. For personal use I would not recommend LabView (along with NI equipment) due exclusively to the cost. But, I’d say that over the past 20 years or so, LV has proven itself useful (at least to those whose opinions actually matter) in data acquisition, machine control, data processing and logging, data presentation, user interfaces, etc.
I am a Certified Labview Developer like several of the other posters, so I do get paid to learn the stuff. As such, you may consider me biased.
I’m surprised at how many HAD followers are LabVIEW developers. I myself write LabVIEW code for my job.
I’ve programmed in several languages: C, C++, C#, Pascal, VB, Delphi, Ruby. All fun to program in. But LabVIEW makes data acquisition easy.
Let me just say this about labview. We had an engineer (EE master’s degree) work for 2 years on a single Altera FPGA chip to control 16 video cameras at the same time, using serial control and COAX control, which is PTZ control on the video line.
After 2 years….. he never finished it. I (AS computer tech degree) spent one day with an sbRIO and our custom interface board and had COAX PTZ control working over the network. And with NI help the rest of the application took a month to design.
So how much does it cost to pay an EE for 2 years work and how much did it cost for a sbRIO and a month for me to work on the same thing….
Anyways, I hope I’ll be seeing everyone at NIWeek 2010! I just signed up today for it. Taking the FPGA class as well. 6th Street, Austin, here I come!
Maybe all this Labview issue is just like the Lotus Notes issue. Lotus Notes sucked badly, was obsolete, migration was a problem, everybody knew that, but there were plenty of jobs as Lotus Notes “certified developers” in the stupid corporate IT world that insisted on using Lotus Notes because it was IBM’s… and if you buy IBM you can’t be wrong… or get fired because of it.
p.s. labview reading an arduino with a connected POT? seriously… I think I’m going to post an arduino controlling an LED from labview using the labview RESTful service from a cellphone! Usefulness of it… remote panic alarm!!
I’m still in high school (Romania) and I got to play with LabVIEW after we got the Vernier interface. My physics teacher asked me to wrap my head around it back in 10th grade. I didn’t do anything complicated with it, just standard data collection and some output tweaks.
For educational purposes I think it’s really useful and easy to use. I didn’t encounter any difficulties while trying to set up sensors and such.
A perfect example is a terminal for PID control:
a simple interface to set P, I, D values, and charts to watch the process variables.
I bet no one could do this faster than in LabVIEW.
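As a rough illustration of how little firmware such a PID terminal needs behind the front panel, here is a hedged sketch of the embedded half on an Arduino: gains and setpoint arrive over serial as one-letter commands (e.g. “P2.5”), and the process variable is streamed back for charting. The pins, command format, scaling, and fixed loop rate are all assumptions for the example, not part of anyone’s actual setup.

// Simplistic PID loop with serial-tunable gains. A LabVIEW front panel
// (or any terminal) sends "P<val>", "I<val>", "D<val>" or "S<val>" and
// charts the value printed each cycle.
const int inputPin = A0;     // process variable, e.g. a temperature sensor
const int outputPin = 9;     // PWM output to the actuator
float kp = 1.0, ki = 0.0, kd = 0.0;
float setpoint = 512.0;      // in raw ADC counts, for simplicity
float integral = 0.0, lastError = 0.0;

void setup() {
  Serial.begin(9600);
  pinMode(outputPin, OUTPUT);
}

void loop() {
  if (Serial.available() > 0) {          // gain/setpoint updates
    char which = Serial.read();
    float value = Serial.parseFloat();
    if (which == 'P') kp = value;
    else if (which == 'I') ki = value;
    else if (which == 'D') kd = value;
    else if (which == 'S') setpoint = value;
  }

  float pv = analogRead(inputPin);
  float error = setpoint - pv;
  integral += error;
  float derivative = error - lastError;
  lastError = error;

  float out = kp * error + ki * integral + kd * derivative;
  analogWrite(outputPin, constrain((int)out, 0, 255));

  Serial.println(pv);                    // stream the process variable
  delay(50);                             // fixed ~20 Hz loop keeps the math simple
}

A real controller would scale by the sample time and guard against integral windup, but the point stands: the front panel is the part LabVIEW makes easy, and the firmware behind it can stay tiny.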
ok, you have to know exactly which datatypes you are wiring; coding experience assumed.
any questions?
LabView is a horrid language as far as the design is concerned. Documentation is poor, graphical layout is poor, there is no zoom, completely different functions might sit very small distances apart on the screen, easy things like 1+1 take far too long to do (create two constants and a “plus” triangle, connect them with wire), it is hard to modify existing programs due to complexity, and debugging is also not a strength of LV. It is really just a tool that makes sense in process automation and control, and where you pay NI to do your programming for you (which is basically what using commercial VIs and hardware amounts to). But from that I would be hesitant to say Labview is generally good or generally bad. What is bad, however, is the stupid, useless task the total overkill solution here is used for.
I, too, am surprised at the number of HaD readers who are certified LabVIEW developers. While I cannot claim to be part of this crowd, I do have [minimal] experience with LV.
Deviating from the current thread, however, I always try to remember my lowly beginnings. There was a time that I had to learn to solder, and had to scrape to afford a somewhat-better-than-cheap multimeter. Undoubtedly, there are readers who have never played with data acquisition environments/hardware due to cost and availability. I see this post as a great way to begin sharing info on creating cheap home laboratory setups for the financially challenged hackers.
Unfortunately, the only hack I can think of at the moment is the old method of using a soundcard as an oscilloscope, but I would love to hear ideas from the rest!
“Personally from the one time we were forced to use labVIEW, we hated every second of it.”
I guess you either love it or hate it. I’m in the “hate it” camp:
– can’t document it (it’s graphical)
– can’t archive it (setup/initial conditions don’t get saved)
– doesn’t work the way you expect it to (user interface is not “Windows standard”)
– seems to be interpreted, and therefore reacts slowly…*too* slowly.
– oh yeah, the price, and the node locking license
We have tried to use it here at work, and it’s just too frustrating to use. You *can* get it to work, but it could really use a top-to-bottom redesign. The problem is, it’s a software development environment, but it doesn’t obey any of the rules. I should be able to ZIP up a Labview environment and send it to someone, who could then install it on his platform and reproduce my experimental setup (with all initial conditions, sampling rates, input scaling and such)…but I can’t do that, and that’s why I hate Labview.
@Peter
– You can document it. Labels are used for this purpose.
– You can choose to save your initial conditions if you write it so just like any other UI.
– Whether it works as you would expect is a matter of opinion of course.
– It isn’t interpreted, it’s compiled to machine code.
– No argument about the price.
anyone got a torrent of it because it’s not worth that price tag.
It sounds like most of the people that hate LabVIEW hate it because they don’t know how to use it correctly. It is definitely more suited for some tasks than for others. There is TONS of documentation and NI runs a support forum where anyone can ask and answer questions, as well as get their questions answered by certified developers. This support is completely free and on top of the phone support you get for purchasing it. Documentation of VIs is easy – you can type text anywhere you want in both the block diagrams and front panel. There is no zoom because if your block diagram is bigger than one screen you’re doing it wrong. I’m not certified yet but I do use LabVIEW for my job and I recently took the Core 2 class. I am studying for the developer certifications and plan on taking the tests this year or next.
@fartface:
Look at the usual sites: piratebay et al.
Or use filestube.com to look if someone uploaded it to rapidshare/megaupload and the like
‘it doesn’t “beg the question”, it “raises the question”’
If you persist in using obsolete meanings for common words, you are going to be misunderstood or ignored. No one uses ‘beg’ to mean ‘avoid’ except the set of people who get angry about the use of ‘begging the question’.
Let it die. Language moves on.
For several years during college I worked in a lab testing Gigabit Ethernet devices. We had some very complicated LabVIEW programs. Although as an aspiring software developer at the time I didn’t like the tool much, the tools we made with it were very impressive. Some of the people who made them had ridiculously high salaries right out of school thanks to that work, too.
My old desk – a lot of the stuff they do there would make great stories on HAD. The FPGA connected directly to a SmartBits / logic analyzer is just one of many great hacks that are used every day:
http://www.iol.unh.edu/services/testing/ethernet/tools/gmii/
Ok, so we’ve established that Labview is expensive, powerful, and not everyone likes it.
Better for us hobbyists and homebrewers to look at open-source options for such work.
As someone else mentioned, there is Scilab:
http://www.scilab.org/
Another interesting package is PureData:
http://puredata.info/
and Flossmanuals writeup of puredata:
http://en.flossmanuals.net/puredata
can you offer any others?
I love the comparison between LabView and Lotus Notes. For me it’s particularly relevant, as we unfortunately use Lotus Notes here at my job (on top of using LabView). However, I would have to say that the comparison itself is 100% wrong. The difference is that, as has been seen here, people that develop in LabView on a regular basis by and large like it. On the other hand, even people that work in Lotus Notes on a regular basis usually admit that it sucks. From my perspective, price is the only really legitimate complaint. The other complaints (lack of documentation, lack of help, no ability to comment, no ability to save settings, etc.) are all examples of people that don’t know how to properly use the tool.
The simple fact is that it’s a graphically based data flow language. It’s a radically different paradigm than most programming languages. Personally, most of my computer science education and experience has been in conventional languages like C and C++. I love those languages. However, after using LabView for a while, I’ve come to respect it as being a powerful tool as long as you’re willing to put in some time to get used to the differences.
Hi guys. LabView is great. CompactRIO is great. The prices are high, as the licences and the hardware are very expensive. It’s a must for development purposes and instrumentation applications. For out-of-the-lab, real-world scenarios (I mean series production, high reliability, low power consumption, hardware simplicity) it can’t compete with custom-designed hardware (with cheap microcontrollers and DSPs) and software (with C/C++ code, assembly optimizations, etc.).
Even for these real-world applications, people usually develop in LabView (or in any other development framework, such as MATLAB, Scilab, whatever) to rapidly create prototype solutions and test the main concepts behind them. These tools are useful because they hide the complicated aspects of data acquisition platforms, executive/operating systems, etc.
After an initial/prototype design, people who don’t intend to be stuck with NI tools forever start porting the LV application to custom hardware. This porting is the problem. LV is not only a language; it is a real-time framework comprising a real-time executive/operating system (for embedded applications) and the real-time runtime for the G language. This is where the madness begins.
The G language is very abstract compared to C/C++ code, as it can be executed in parallel where the hardware permits. Directly porting it to other platforms requires the LV Runtime or C/C++ code generation tools (which still require libraries from National) and a competent executive/OS. The original prototype’s performance and results using the dedicated National HW/LV are *hard* to obtain on the custom HW this way. Debugging the generated code is insane. So is maintaining it. Documentation of the generated code is completely absent. The memory footprints are BIG. The program footprints are BIG. The code can’t be easily post-optimized by hand.
If you *can* afford the National Instruments LV/HW related costs, if you think the National hardware is reliable enough, if you can afford the HW power consumption, size, housing, etc., and if you can survive forever stuck to NI – PERFECT, you found the solution for all your problems; NI/LV is the yellow brick road for your systems. It’s only a question of whether or not you want to live inside a “National” shell.
If not, keep using other frameworks and your FREE/STANDARDIZED preferred language, create your own toolbox of functions and routines, use and abuse GNU tools and cross-compilers, code revision tools, version control tools, CASE tools, whatever.
Maybe if National turned the G language, LV and LV/RT into FREE, open-source tools, with tons of third-party hardware/software suppliers, and if they adopted other languages inside their framework (hey National, haven’t you heard of IEC 61131-3 for control/automation, or IEC 61499?), this could be a de-facto solution for us all!
Greetings to all of the hardware hacking community!
Eduardo Pellini
Ha! Well this sure turned out to be a hot topic. ;) Guess I’ll chime in with my 2 cents. I’ve used LabVIEW extensively. I’ve found its utility to be entirely related to the intended application. I think the power in it lies in its prebuilt modules. For instance, if you are building an instrument test platform like a bed of nails and need to do signal analysis using a Fourier transform module, it is a pretty good choice. If you are going to do sequential process automation for, say, a chemical vapor vacuum deposition oven, it is a royal nightmare. I think many people like it for its ease of use in creating GUIs. Funny how no one noted its worst attributes of all, those being difficulty in debugging and the inability to resume a defined program startup state after a crash. If you like watching signals “flow” over wires while debugging, this is just the ticket for you.
Interesting discussion.
I used LV several years ago for a car manufacturer to measure electric car parameters. LV is good at what it is made for (easy measurements under industrial constraints such as time to market, eye-candy GUIs, and certified designers…).
I also used Matlab/Simulink/RTW for controlling a hybrid car prototype (hardware-in-the-loop in 1999!). A really great tool (very expensive also) for that purpose. These packages could be used for recording measurements, but are probably not as good as LV for that purpose.
For hackers, a free equivalent solution may be Scilab/Scicos/RTAI (https://www.rtai.org/RTAILAB/RTAI-Lab-tutorial.pdf). I never tested it, but I have seen an interesting demo and the project is active.
Labview is a great tool for its readability. Double-click on the block diagram and leave notes wherever you want, or embed them in the description field of the components. For quick prototyping and powerful features, your only setback is price. Most lab equipment manufacturers supply VIs for their equipment, so it is a matter of dropping in the components. The design paradigm is very different from conventional programming and each program requires the Labview runtime environment, but for those of us that don’t live in front of a computer for our office work, it works wonders quickly. Reliability is as good as the coding, and you can make the code extremely complex if required. If you don’t like it, don’t use it.
I’m not using my name in fear……
…If you want to use labview forever… register with NI, download the 30-day trial version, but first set your clock forward a year :) after you’ve run it once and see you have 30 days, set the clock back to this year. Enjoy 395 days of use. A new version comes out each year now.
Repeat and Enjoy.
How about this for a hack using LabView.
http://decibel.ni.com/content/groups/diy-labview-crew/blog/2009/10/20/a-commodore-64-emulator-written-in-labview
C64 emulated using labview code.
OK, we’ve had enough good vs. bad stuff, so I will not add my opinion on Labview here.
However, without criticizing the present “hack”: I guess everyone who ever played around with Labview and a uC did exactly what was announced here as a big new thing.
Come on – reading out serial data from a uC in Labview? This is a task every university undergraduate student might find in their practice course. Furthermore, every dev who has a device with serial communication would do the same. What is really new here?!!?
Or is it simply the fact that the arduino makes such a nice companion to Labview and vice versa ;)
Maybe
Personally, I love LabVIEW. (Full disclosure: I’m also an NI employee.) One of the coolest aspects of my job is seeing all the amazing things that people are doing with LabVIEW – stuff we never would have thought of. (like this example!) I’m happy to see all the discussion here too. Of course, I can’t resist the opportunity to share some of my own “insider” thoughts…
LabVIEW was originally conceived to graphically represent the way many scientists and engineers (and even some programmers) think: with flowcharts. The vision is to empower the domain expert with a tool to get the job done – without requiring a CS background. Does this mean the data flow programming paradigm makes sense for every application? No… nor is it intended to. But can it help solve the engineering challenges that we (the scientific community) face every day? Absolutely. Our goal is for engineers to spend more time finding solutions to problems and less time worrying about how to build them.
Yes, LabVIEW costs money. That’s unlikely to change in the near future, but I believe our customers see the value proposition. I think pRoFIT (above) said it best with: “So how much does it cost to pay an EE for 2 years work and how much did it cost for a sbRIO and a month for me to work on the same thing….” We’ve seen some pretty powerful demonstrations of LabVIEW in other scientific endeavors as well.
We know LabVIEW isn’t perfect. Our R&D team wouldn’t have much to work on if it were. That’s why we want to hear this feedback. Posts like this get circulated all over the company – trust me, there’s a LabVIEW R&D engineer reading this right now… Honestly, send us your ideas. In the meantime, I’d love to see more of you LabVIEW developers post your projects on Hack a Day!
The reason this is cool is that an arduino is 1/6th the cost of the cheapo NI-DAQ boards – and those don’t even have hardware PWM.
+1 for this link. Sent it to my professor. The new set of undergrads just got unleashed on the equipment and they are running out of working NI-DAQs.
Also looking into another post where an Arduino interfaced with MATLAB.