Linux Fu: Better Bash Scripting

It is easy to dismiss bash, the typical Linux shell program, as just a command prompt that allows scripting. Bash, however, is a full-blown programming language. I wouldn’t presume to tell you that it is as fast as a compiled C program, but that’s not why it exists. While a lot of people use shell scripts as an analog to a batch file in MS-DOS, bash can do so much more than that. Contrary to what you might think after a casual glance, it is entirely possible to write scripts that are reliable and robust enough to use in many embedded systems on a Raspberry Pi or similar computer.

I say that because sometimes bash gets a bad reputation. For one thing, it emphasizes ease of use. So while it has features that can promote writing robust scripts, you have to know to turn those features on. Another issue is that a lot of the functionality you’ll use in writing a bash script doesn’t come from bash; it comes from Linux commands (or whatever environment you are using; I’m going to assume some Linux distribution). If those programs do bad things, that isn’t a problem specific to bash.

One other issue limiting bash is that many people (and I’m one of them) tend to write scripts using constructs that are compatible with older shells. Oftentimes bash can do things better or more neatly, but we still use the older ways. For example:

if [ $X -gt 0 ]
then ...
fi

This works in bash and a lot of other similar shells. However, bash can do better with its [[ ]] construct, which, among other things, works on strings instead of integers. Be aware that > inside [[ ]] is a lexicographic string comparison, not a numeric one (for integers, stick with -gt or the arithmetic (( )) form):

if [[ $X > 0 ]]
then ...
fi
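
While we are at it, [[ ]] also does regular expression matching with the =~ operator, with capture groups landing in the BASH_REMATCH array. A minimal sketch (the version-string format here is just an illustration):

read -r -p "Version? " ver
if [[ $ver =~ ^([0-9]+)\.([0-9]+)$ ]]
then
    echo "major=${BASH_REMATCH[1]} minor=${BASH_REMATCH[2]}"
fi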

Features

Don’t think bash is a programming language? It has arrays, loops, sockets, regular expression matching, file I/O, and lots more. However, there are a few things you should know when writing scripts that you expect to work well. You might add your own items to this list, but this is what comes to mind (a combined boilerplate sketch follows the list):

  • Use “set -o errexit” to cause the script to exit if any line fails
  • Use “set -o nounset” to cause an error if you use an empty environment variable
  • If you don’t expect a variable to change, declare it readonly
  • Expect that variables could contain spaces, and quote accordingly
  • Use traps to clean up your mess
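
Taken together, the first two items (plus pipefail, which we’ll get to shortly) make a handy boilerplate header for new scripts. A minimal sketch; treat it as a starting point rather than gospel:

#!/bin/bash
set -o errexit    # stop the script on the first failing command
set -o nounset    # make using an unset variable an error
set -o pipefail   # a failure anywhere in a pipeline fails the whole pipeline

readonly TIMEOUT_S=10   # example constant; readonly guards against accidental change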

Exit on Error

If you use “set -o errexit” then any line that returns a non-zero error code will stop script execution. You might object that you want to test for that condition like this:

some_command
if [[ $? != 0 ]]
then 
   recover
fi

If you use the errexit flag, that test will never occur because once some_command throws the error, you are done. Simply rewrite like this:

some_command || recover

The effect is if some_command returns true (that is, zero), then bash knows the OR operator is satisfied so it doesn’t run any more commands. If it fails, then bash can’t tell if the OR is satisfied or not, so it runs recover. The exit code of the entire thing is either 0 from some_command or the exit code of recover, whatever that is.

Sometimes you have a command that could return an error and you don’t care. That’s easy to fix:

some_other_command || true
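
Note that errexit deliberately ignores failures it can see you are testing: a command used as the condition of an if or while, or on the left side of && or ||, won’t stop the script. So this form is also safe with errexit enabled:

if ! some_command
then
    recover
fi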

By the way, usually, the last item in a pipeline determines the result. For example:

a | b | c

The exit code of that line is whatever c returns. However, you can “set -o pipefail” to make a pipeline fail if any stage of it fails (combined with errexit, that halts the script). Even better is the $PIPESTATUS variable, an array holding all the exit codes from the last pipeline. So whatever program b returned will be in ${PIPESTATUS[1]} in the above example.
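
A quick demonstration of PIPESTATUS, assuming errexit and pipefail are off so the failing pipeline doesn’t abort the script first:

false | true | true
echo "exit codes: ${PIPESTATUS[@]}"   # prints: exit codes: 1 0 0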

Unset Variables

Using “set -o nounset” forces you to initialize all variables. For example, here’s a really bad script (don’t run it):

TOPDIR=tmp
#rm -rf /${TOPDIRR}

You can argue this isn’t great code, but regardless: because TOPDIR is misspelled as TOPDIRR on the last line, the variable expands to nothing, the command becomes “rm -rf /”, and you’ll erase your root directory if this runs without the protective comment in front of the rm. With nounset set, bash flags the unset variable and stops instead. This works for command line parameters, too, so it will protect you if you had:

bad_cmd /$1
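
When a variable legitimately might be unset, parameter expansion lets the script coexist with nounset. A minimal sketch (the fallback value is just an illustration):

TOPDIR="${1:-tmp}"                      # fall back to "tmp" if no first argument was given
echo "target is /${TOPDIR:?is empty}"   # :? aborts with a message instead of expanding to nothing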

Readonly

Many times you set a variable and you really need a constant. It shouldn’t change as the script executes and if it does that indicates a bug. You can declare those readonly:

readonly BASEDIR="$HOME/testdir"
readonly TIMEOUT_S=10
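
Any later attempt to assign to one of these fails, and under errexit that failure stops the script:

BASEDIR="/elsewhere"   # bash: BASEDIR: readonly variable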

Expect Spaces

File systems allow spaces and people love to use them. This can lead to unfortunate things like:

rm $2

When $2 is something like “readme.txt” that’s fine. However, if $2 is “The quick red fox” you wind up trying to erase four files named “The,” “quick,” “red,” and “fox.” If you are lucky, none of those files exist and you get errors. If you are unlucky, you just erased the wrong file.

The simple answer is to quote everything.

rm "$2"

If you ever use $@ to get all arguments, you should quote it to prevent problems. Consider this script:

#!/bin/bash
function p1
{
  echo "$1"
}

p1 $@
p1 "$@"

Try running the script with a quoted argument like “the quick red fox”. The first function call will get four arguments and the second call will get only one, which is almost surely what you intended.
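
If that script were saved as quotetest (an illustrative name) and run as ./quotetest "the quick red fox", it would print:

the
the quick red fox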

Traps

It isn’t uncommon for scripts to create temporary lock files and other things that need cleaning up if the script stops. That’s what the trap command is for. Suppose you are working on building a file called /tmp/output.data and you want to remove it if you don’t get a chance to complete it. Easy:

trap "rm -f /tmp/output; exit" INT TERM EXIT

You can look up the trap command for more details, but this is a great way to make things happen when a script ends for any reason. The exit command in quotes, by the way, is necessary or else the script will attempt to keep running. Of course, in an embedded system, you might want that behavior, too.

You probably want to remove the trap before you are done unless you really want output.data deleted, so:

trap - INT TERM EXIT
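
For anything beyond a one-liner, a common pattern is to put the cleanup in a function so the trap stays readable. A sketch along the lines of the example above:

cleanup() {
    rm -f /tmp/output.data
}
trap 'cleanup; exit' INT TERM EXIT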

Wrap Up

You should consider turning off features you don’t need, especially if taking input from outside your script. For example, using “set -o noglob” will prevent bash from expanding wildcards. Of course, if you need wildcards, you can’t do this, at least not for the part of the script that uses them. You can also use “shopt -s failglob”, which makes a wildcard that matches nothing an error, if you want to harden your script.
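
A quick demonstration of noglob:

set -o noglob
pattern="*.txt"
echo $pattern   # prints the literal *.txt instead of a list of matching files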

Speaking of security, be very careful running user input as commands. Security is an entirely different topic, but even something that seems innocent can be manipulated to do bad things if you are not careful. For example, suppose you secure sudo to allow a few commands and you offer this script:

sudo -u protuser "$@"

If sudo is set up right, what’s the harm? Well… the harm is that I can pass the argument “-u root reboot” (for example) and sudo will decide I’m root instead of protuser. Be careful!
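
One common mitigation, also suggested in the comments below, is to end option parsing with a double dash so the user’s arguments can no longer be read as options to sudo:

sudo -u protuser -- "$@"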

There are a lot of tricks to writing bash scripts that are portable. I don’t care about those in this context because if I’m deploying an embedded system on a Raspberry Pi, I will control the configuration so that I know where /tmp is and where bash is located and what version of different programs are available. However, if you are distributing scripts to machines you don’t control, you might consider searching the internet about bash script portability.

If you want to catch a lot of potential errors in scripts (including some portability issues) you can try ShellCheck. You might also appreciate Google’s shell style guide. If you aren’t sure bash is really a programming language, this should convince you.

85 thoughts on “Linux Fu: Better Bash Scripting”

    1. No point if you know perl.

      Perl is the ultimate administrative language. And it extends the bash syntax. It’s much faster and more powerful than any other shell including python shells. It’s also very unix oriented.

          1. Wow. If you think Perl is more cryptic than Bash, you’re doing one or the other very wrong. Perl, like Bash, has a history of misuse and lack of best practices. However, unlike Bash, Perl now has widely-accepted modern coding practices (“modern Perl”) and is used by large organizations in large projects involving their core infrastructure. Don’t judge a language by how people misuse it. Especially don’t judge Perl by amateurs who think the llama and alpaca books represent modern coding standards and who never move beyond them.

      1. Perl has very little in common syntax-wise with bash. It’s closer to Awk and Sed in origin, and has a lot of its own constructs that neither of those two have.

        1. Thank god someone said it. If something starts getting unwieldy in bash I’d prefer to jump to python, at least I know the next person to touch the code is going to have an easier time reading what I’ve written.

      2. There is a point if Perl isn’t installed on the machine you are working on, and there is no way to get it installed. This situation is not uncommon in many high security industrial settings with legacy hardware and operating systems.

        You need to be aware of multiple ways to solve a problem and use the tool that makes the most sense in that particular situation.

        Statements such as yours are very narrow minded and show a lack of real world experience.

      3. Perl was great when it was the only game in town but it’s been superseded by better languages. It has multiple ways to do everything which makes working on someone else’s code a nightmare.

  1. There is one simple thing to know about using bash as a scripting language: DON’T!!

    If I do anything that is more than a few lines with linear flow, I immediately jump to a real language. These days I go for ruby every time, but it used to be perl for me. Other people would be advised to go with python.

    A good rule of thumb would be to use a “real language” for anything longer than 12 lines — trust me.

    Yes, bash is a full blown programming language, but a wretched one. The only thing worse is Tcl.

    … It is fine as a shell though, and I think the best choice in that arena.

    1. Same. I’ve seen amazing things done with Bash, but I’m already more fluent with tools better suited to complex logic.
      I use Bash for simple automation of Linux commands, but for anything more complex I usually switch to Python.

      1. Thank you Steven. I have been working with Unix since 1988 and Linux (0.99 kernel) since 1991 when it was first released. I have worked with all the shells, but I switched 4 years ago to Python and pretty much love it. All languages have their headaches and none are perfect. I use Bash for .bashrc or .bash_profile functions, or small 6-12 line scripts, since it is my fallback language. But I use Python for everything else. I like the OOP library, as I’ve seen SAs rewrite the same code over and over in Bash. Perl has tried to wrench it in, but it seems to be more a hack than a rewrite of Perl. I do not care for Ruby’s syntax, which is one of the things I like about Python. Syntax is the first thing that will trip me up. As I stated earlier, no language is perfect and I’ve seen languages come and go. But for today, Python seems to be the best choice for system scripting/coding. Tomorrow… who knows? As someone said in this comment section, using the best tool is the primary thing.

    2. I am going to disagree. I mean, sure, I like programming in a better language, too. But the thing that bash gets you that Ruby doesn’t is its near ubiquity on modern *nix platforms. And if you limit yourself, you can grow that footprint by presuming ksh, for example. Granted, Perl and Python are almost there too, although I have had enough Python hell with version mismatches that I’m wary of it. Perl is a bit more universal, but depending on what I want to do, I’d prefer shell; Perl is a bit of everything all in one, which makes it baroque.

      At the end of the day, we all have our favorite tools, but there is nothing wrong with bash other than people tend to NOT use its abilities to be robust because it doesn’t force you to do so.

      1. On any system worth spending time on these days a package manager will install ruby or python or perl in seconds, so the ubiquity argument is totally bogus. Perl is weird, but always gets the job done. Ruby is slick, some people call it “perl done right”. These days I always reach for ruby.

        1. You must be indeed fortunate that you work on machines that don’t belong to clients or have draconian IT restrictions. I promise you that if you try to do wide deployment you WILL find users that have machines that do not have ruby and have no mechanism to install ruby. Or any other thing you name. Yes, bash isn’t 100% universal. But it is quite good at being most places without cross platform issues and strange wacky versioning and library problems.

          I mean, look, ruby is fine. So are a lot of other things. Use what you want, but the fact that you think “totally bogus” tells me you don’t deploy to a very large and varied set of users. If you are carving a pinewood derby car, a chainsaw seems stupid, I guess. But if you are a lumberjack, they are pretty awesome.

          1. Well, OK, you’re a tough crowd. I certainly won’t argue with using whatever tool gets the job done. In my world though these days it seems that if you have *nix you have it all. On the other hand if I am doing deep embedded stuff, I don’t have bash or a linux kernel and am writing C to run on bare metal, but you are right, I am usually in a situation where I am calling the shots and not vice-versa.

          2. I concur. I used ksh on SunOS once to emulate expect and upload firmware line-by-line to an attached device. Would an expect script have been easier? Of course, but these were in telephone COs where there was no internet connectivity and X.25 was king.

            Kids these days can’t conceive of a day when the internet wasn’t at their fingertips, Am-I-right, Al? Humbug!

          3. Yep, the only systems I’ve seen where you could just slap on a new lang env were startup envs, like the ultra-hobbyist way of things.
            Anywhere else – no of course not.
            Though in fairness, Python has its place as a default now, and Ruby *would have* if not for the versions.
            At a startup we rpm-build ruby in-house because there’s no source for a current & secure ruby on centos (the languages team itself delivers an outdated one)
            I dislike python for systems code because it encourages laziness on the error handling.

          4. There’s nothing draconian about having proper IT security.
            Those places just try not to be the idiots in the newspapers.
            Having no extra software installed, having no outgoing internet access on web servers, having not this, that, and the other *is* the one barrier that lets your system live another day.
            Using bash, in that context, also means all your scripts will still work after some emergency patch fun.

        2. Yes…and in a large organization, how often does the person performing a task that needs to be done NOW actually have the ability to install packages on a system? And how often is there an application on the system that has a dependency on a specific version of {Python|Ruby|Perl|Whatever}? In my organization, the answers are “rarely” and “frequently,” in that order.

      2. Exactly. On a “tight” machine that is pretty-much locked down, use the native shell (often BASH). If you need to parse a lot of stuff, install PERL 5.x and trim it down for a lot of reasons (including safety, not to mention bloat in default PERL installs). The future, especially in the embedded world is owned by the “old” tools at the CLI. As for bloat in BASH, you can pare that down, just like PERL. Same for any *nix. netBSD is a flavor that is easy to tame, then for added security, pick and place what you like from OpenBSD. Pre-production, dev in FreeBSD. The concept of eschewing shell scripting etc. that you control for something like Ruby is just abominable IMO!

      3. Agreed! Bash is on almost everything these days. While Python is certainly my preferred language for admin tasks, I find versions anywhere from 2.6-3.5 on systems these days, and with both Python and Perl, I can’t ever depend on the modules I need being installed.

    3. Not everyone has access to install whatever language they want (e.g. working in a government/corporate environment); this means you often have to use bash or any of the other standard scripting languages that come with linux/unix, or js/cmd/vba/powershell on Windows.

      YMMV though… :)

          1. djsmiley2k, as someone who spent more than 30 years on that front line, I salute you. MS-DOS/Powershell batch script is cryptic, bizarre, archaic, ….and vital to know. I can remember writing scripts that passed themselves to DEBUG to extract and create a .COM file to add commands that the script needed because the security environment would only allow scripts, not extra programs.

    4. I disagree.

      There is a certain set of tasks which are solved MUCH easier in bash (or similar shells) than in a “real” language like python: taking the output of one external program and piping it to the input of another one.

      Yes, there is e.g. asyncio in python, and it can do really cool and complex things, but setting up e.g. basic piping between three programs is much more complicated than the one-liner in bash.

      So if the task involves a lot of piping between programs, I tend to use bash for it even if other parts of the program are better solved with a “real” language.
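
      For example, a three-stage pipeline like this (the commands are just placeholders) is a single line in bash:

      dmesg | grep -i usb | tail -n 5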

      1. Unless you *really* can’t just read all the data into memory, doing that in a real language (python, go, etc.) is easy and will get you a program that isn’t full of bugs.

        I can’t think of a single non-trivial example where bash is best.

    5. Not if you are dealing with a network install of a UNIX OS. At install time, there is no other option really than a shell script at least until you can reboot and continue with a run once, delete after running, startup script which would also be a shell script.

      You could do something a bit more complex involving a remotely mounted applications directory with perl/ruby/python/….. but at install time, that is just asking for problems.

    6. I disagree. There are many times when bash is the obvious choice. If you find yourself calling Popen in your python script more than once, you’re doing something wrong. Just use bash, as its forte is launching other programs and piping their output elsewhere. Its syntax is a bit verbose, but easier to read than perl, I would argue, and its library is the full set of *nix applications and tools. If you know bash but don’t know sed, awk, and the rest, you’re selling yourself short.

      Bash is ubiquitous, stable, and the right choice for many tasks, not all, of course. Use the best tool for the job, even if it’s not your favorite.

      Now TCL; that’s a wretched beast, for sure. That and DOS batch…

    7. Another good rule of thumb would be to use a “real language” when decimal numbers may be involved, like calculating averages, timestamps with higher resolution than seconds, any parameter that could possibly need to be a non-integer number in the future, etc. There are ways around it, and if all else fails you could just multiply all inputs by 1000 and split output variables to insert a decimal point in some arbitrary location, but it’s not nice. I too would recommend Python, or just plain C when most data is numerical (or multidimensional arrays) instead of strings.

    8. In the early days I avoided Perl: Anything I could do in Perl I could do faster in awk and sed. Perl has evolved. I thought I was going to work for a company that really depends on Perl, so I made an effort to learn it. Some things are cool (such as the ability to use meta-programming to generate conditional scripts). (In the old days we FORTRAN programmers called those meta-programs “macros”. Perl does it better than FORTRAN did.) The upshot? I STILL don’t like Perl. I would rather do shell scripts than Perl for most smaller tasks, and if I need to do something more elaborate, I prefer Python. (BUT, for real development I prefer LISP/Scheme/Racket, assembly, C, and FORTH. Take my weirdness into consideration when you evaluate my reply.)

      But the thing I like best about shell scripting is the feeling of “control over my system”. The hints in this article are good points for better scripting.

    9. Always happy to find scripts in higher level languages like this.
      Then I can be 100% certain I’ll find no reasonable error handling in them and obscure race conditions, and since it’ll then be “vendor-style”, I also won’t be allowed to replace and fix the crap.
      Oh, and not to mention Python & Ruby code with the slight version regressions, where suddenly an API query stops working because a third piece of software upgraded a second library.

      All fine in normal app dev, but intolerable for systems code.

      Keep in mind that system interruptions in mission-critical systems are 30% hardware, 40% human error, and 30% software bugs. Aspiring to go big in software kills things all the time, and is what raises the human error share so high (bad CLIs because time was spent on doing some cool OO stuff instead, just for example).

      I know a few people who think they’ve mastered bash so well (and fuck up) but they’ll also constantly ask to go to perl (and they fuck up there, too).
      The lesson I took there is: If they’re not modest & thoughtful enough to write safe bash code, it doesn’t matter if they switch languages. If they’re very good infra coders, they _can_ use another language, but probably won’t even need to. They’ll describe the practical advantages for the current project, not “bash is bad for coding”.

  2. Do try to avoid bashisms. Especially in embedded environments you’ll probably want a less resource-hungry shell. You’ll also stay compatible if you decide to eg. try one of the *BSDs, which don’t use bash by default. Several Linux distros also use bash only for the interactive shell, and use something else for /bin/sh.

    1. Yeah, incompatibility between shells is why I always write my shell scripts for dash. Being a minimal standard-compliant shell, if it works in dash it’ll probably work in anything else.

        1. I also normally write for posix shell only. I’ll likely do a few convenience things that only work in ksh/bash but normally those will be only where it’s possible to rip out and replace them with limited effort.
          IMO main issue is indeed bash here which doesn’t properly limit its syntax when called as “/bin/sh”, meaning people *try* to write portable code, but it isn’t.

  3. bash is one of my favorite things to write scripts in. If I’m honest, I probably have overused it for things but it’s so fun to mess with and I like the challenge of writing things in bash that it was absolutely not designed for.

  4. The trap part was mentioned, but the article did not show the most useful trick: creating a temporary work directory that is automagically removed when the script exits:

    #!/bin/bash
    Work="$(mktemp -d)" || exit 1
    trap "cd / ; rm -rf '$Work'" EXIT

    The use of double and single quotes means that the command is evaluated at that time, so even if one changes the value of the Work variable later on, only the original directory will be removed. The change-to-root ensures that the subtree can be removed even if the script changes the current working directory there, and it happens to be on some weird filesystem like VFAT. It does no harm in the general case.

    The reason this is so useful, and more people should use the idiom in their scripts, is that the EXIT trap is triggered even if the shell is interrupted by an INT signal, which is most commonly generated by the user pressing Ctrl+C.

    Most shell scripts I write can be classed in three categories:
    – shorthand (for longer commands I don’t want to use an alias for, for example because I want to use it to open specific types of files from e.g. browsers)
    – runners (same script run on different systems/architectures or users, choosing options or parameters based on each system, desktop environment, or user)
    – sequences (as much to record the commands needed to complete some logical task, as to actually achieve them; typically chaining several tools to achieve some set of tasks)

    For data conversion and generation, I most often use awk.

    1. I use awk more than I should ;-) I have done some awful things in awk that work well but I’m not especially proud of them. Including large parts of a cross assembler and a Forth compiler.

        1. AWK is venerable – I have used it since the late 1980s. Perl attempted to replicate and extend, but couldn’t manage a human readable syntax. A bash script with one or two in-line awk segments is a fine way to avoid ever touching perl. (Just watch those single quotes. ;)

          The longest awk program I wrote was an 1800-line language translator. (Structured English to ‘C’ with an integral event scheduler, for generating a population of interconnected state machines of arbitrary variety.)

    2. @Nominal Animal +1
      I worked in a Unix shop in the 90s and learned to create shortened versions of commands to do common things like la for ls -al. I found that it was no shame to not remember all of the switches you need to do something like tar. You had to save what worked so that when you ran your weekly backups, you didn’t leave out an important step.

      I maintained batch files for years, and now maintain Linux scripts for personal use. I like being able to see and edit the code instantly.

      But, I don’t do this 8 hours a day, any more, and feel like quite the noob.
      I tried to “look up the trap command” and got “No manual entry for trap”
      Guess it’s time for Google.

  5. trap "rm -f /tmp/output; exit" INT TERM EXIT

    If you are lucky, none of those files exist and you get errors. If you are unlucky, you just erased the wrong file.

    Hint: wrong file name! But that brings up a good corner case: does an error in a trap trigger the exit on error path? Causing the trap to run and error again? And again?

    1. > Does an error in a trap trigger the exit on error path?

      No, but if you trap a signal and EXIT, and the script receives a trapped signal, and the trap code has an error, the trap-due-to-signal will fail (and not exit), then the trap-due-to-exit will also run (and fail).

      In short, if you have

      #!/bin/bash
      trap "this-command-fails ; exit" INT TERM EXIT
      kill -TERM $$

      you get two identical error messages, both saying that “this-command-fails: command not found”.

      To avoid this, use e.g.

      trap "true cleanup" EXIT
      trap "exit" INT TERM

      instead.

  6. Shell coding is dangerous and a good example is the complexity of whether a failure of a command in a pipe causes an exit with ‘set -e’. Your example of ‘a | b | c’ is just the tip of the iceberg and hides the fact that other language constructs have the same issue. One example is commands that look like assignments but aren’t (like export or local); for that example read http://mywiki.wooledge.org/BashFAQ/105#line-180 That page shows the myriad of ways that bash is ready to shoot you in the back with case after case of exceptions and unexpected behavior. And you can use ‘set -o pipefail’, but then you lose the option of ‘cmd_that_can_fail | true’ to ignore failures that are acceptable, and you have to set your own mental state when editing files to think about the effect of these options on the execution environment.
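
    To make the export/local case concrete (a sketch; might_fail is a placeholder command):

    f() {
      local out="$(might_fail)"   # $? here comes from 'local', which succeeded
      local out2
      out2="$(might_fail)"        # split declaration from assignment and $? is preserved
    }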

    I’d suggest picking up Python.

      1. “Shellcode” is an old term for an exploit (as in “injected code that will give access to a shell prompt”). What is being discussed here is usually called “shell scripting”, something completely different.

  7. I think I’ll go write up a post on Python so I can see how many people tell me I should be using bash ;-)

    I was a consultant long enough that I’m pretty tool-agnostic (except for the vi vs emacs thing Elliot and I keep rehashing; even then, I can do vi, I just don’t like it). So yeah, Python has its place. I do a LOT in C and C++ because those replaced FORTRAN and PL/I as my “native” languages. Sometimes when I want to do networked stuff, I do Java. I’ve done C#. There are a lot of factors that go into “what’s best” and it is very situational. So you like Python? Good for you. Write a post. But some people, by choice or by no choice, are going to have to use Bash. I think everyone will agree that a lot of Bash scripts you see are awful. But my point is this: they don’t HAVE to be. I can write bad code in any language (and have, probably) but because Bash is accessible and “a toy,” people don’t always pay attention to it and just write simple “batch files” when, in fact, Bash can do so much more.

    Consider C (just to avoid provoking the many Python faithful). If you said, “C isn’t suitable for anything because I only know 4 statements and two library calls, printf and scanf. What if I need variable memory? What if I need…. or what if I mess up a pointer?” Well, yeah. All that is true, but if you knew ALL the features of C and the library, you can answer some of those questions.

    Even the pointer thing amuses me. I once had a consultant tell me smugly that “Java can’t leak memory.” I told him that was incorrect and he started telling me about his credentials (which means nothing, of course). Then I showed him how a weak reference can leave objects dangling and not garbage collected. There are ways around that too, but my point is any language you don’t understand fully is dangerous to you if you use it. And sometimes some of us have to use Bash and similar scripting tools.

    1. At one point of my career (about 10 years ago, maybe more) I decided I could just avoid learning bash, awk, and sed and just learn perl — and have never regretted it — though I do use each of these on rare occasions. Remember that one of the virtues of the successful programmer is laziness. Of course I use bash every day as a shell, but we aren’t talking about that just now.

      And yes, the smug dude with credentials. I remember a really long time ago we hired a fellow who told us that we were doing things wrong and all should be doing “structured programming” or whatever the current fad was (this was so long ago that he couldn’t have been telling us we should all be doing OO). The problem was that he could never produce any code and he didn’t last long.

      Python is definitely not my thing, but I think I would choose it over bash if only those two were available. When all you have is a screwdriver, you pound nails with a screwdriver.

    2. If people already have Python skills they should read “Python for Unix and Linux System Administration” and they may not need to spend the time to master bash. This is the key point, the time and effort to acquire the skills rather than extending existing ones. I find that taking this view on the topic helps to avoid endless circular arguments between fans of various languages and methods.

    3. I think shell scripting fills an important niche but it’s still a pain in the arse. And there are too many deus ex machina moments for me to enjoy reading other people’s code.

      Huh I wonder how they did X? Oh there is a command for that.

      Shell scripting isn’t a “propa language” because it requires its users to outsource most of the heavy lifting to another programme. I’m drawing a thin distinction here from using a library, because I could write a library in C; I can’t write one in pure shell script (no commands allowed).

      Still I don’t think all languages should be self hosting or even capable of that.

  8. Seriously, Can we get some kind of trigger warning? ;-)

    It seems like every time there is an article that talks about a particular tool it kicks off some variant of the “emacs vs. vi” rant-chain in the comments. So you use something else, eh? Great, carry on! I believe the spirit of the article was to point out that if you use bash, here are some ways to, *potentially*, use it better.

  9. This is pretty great, in fact it’s what I think these articles should be like – not just offering information that can be easily found elsewhere, but adding value by condensing the author’s practical experience and neat, not-well-known aspects of the topic. Thank you :)

  10. I think I’ve missed something… how is

    if [[ $X > 0 ]]
    then …
    fi

    any “better or neater” than

    if [ $X -gt 0 ]
    then …
    fi

    ???

    I believe you when you say that bash has better ways of handling things than older script syntax, but this example just seems.. ridiculous?

    1. The biggest advantage to the [[]] syntax is you can evaluate expressions very easily including regular expressions. You also get protection from word splitting and path name expansion which is usually a good thing.

      For example:

      if [ -f $FILE ]

      That breaks if $FILE has spaces in it (use -f “$FILE”).

      But

      if [[ -f $FILE ]]

      works fine.

      Don’t forget (( )) which lets you do math, too…

      if (( 5 + 2 == 7 )); then echo Yes ; fi

      Probably makes more sense if at least one of those numbers is a variable ;-)

      Good read: http://mywiki.wooledge.org/BashFAQ/031

  11. While I agree that I’d reach for Ruby rather than sh (and remember that many bashisms don’t work on dash, the default shell in Debian and Ubuntu these days), I’ve accrued enough sh over the years that I can use it for some things. Even so, almost everything in the article is new to me so thank-you for pitching at the perfect level.

  12. Why is this a “Linux” thing? Are we the lowest common denominator here and don’t know any better, so we have to resort to catchy marketing words?

    Bash existed for years before the Linux kernel was created. Bash isn’t a Linux thing.

  13. It appears to be assumed that we old-timers who have used *nix for three decades or more are extinct.
    Heck, if you want to talk about a shell not to script in, then just hark back to Csh.

  14. One thing that I noticed a lot of people don’t know..
    On Linux, /bin/sh is most of the time a link to bash.
    But on most Unix systems, it’s a different (Bourne) shell!
    So don’t use #!/bin/sh in your scripts, but rather #!/path/to/bash…
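
    A portable way to do that is to let env find bash wherever it lives on the PATH:

    #!/usr/bin/env bash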

  15. Regarding bash vs other programming languages, this is how it is done at Google. Quote from https://google.github.io/styleguide/shell.xml : “Shell should only be used for small utilities or simple wrapper scripts. […] Some guidelines:
    – If you’re mostly calling other utilities and are doing relatively little data manipulation, shell is an acceptable choice for the task.
    – If performance matters, use something other than shell.
    – If you find you need to use arrays for anything more than assignment of ${PIPESTATUS}, you should use Python.
    – If you are writing a script that is more than 100 lines long, you should probably be writing it in Python instead. Bear in mind that scripts grow. Rewrite your script in another language early to avoid a time-consuming rewrite at a later date.”

  16. This is awesome! I didn’t know about the nounset directive and I’m using it now.

    There is a typo in that section, though. Your article says “nonunset” instead of “nounset”. There’s an extra “n”.

    Thanks for the great tips!

  17. There are a lot of questionable practices advocated in this article. I’ll enumerate them by heading.

    Point 1: Normal (non-bash) shells can do string ordering too. Just pass ‘>’ to the test command as the operator. If you don’t quote it, it’s a redirection operator, so bear that in mind. (Bash’s behaviour here is a weird special case which makes things easier to write for newbies at the expense of being less logically consistent. Using bash’s behaviour makes your script less portable.)

    Point 2: errexit, nounset and readonly are all problematic. Errexit is a hack that sometimes works but fundamentally is barking up the wrong tree (assuming that all nonzero exit codes are errors, assuming that functions should not be checked when called from an if-statement, etc). Bash is not a language that supports exception handling. Do not try to force some weird pseudo-exception-handling hack onto it and expect it to work. Nounset is similar, but really you should be using the ? operator (in parameter expansion) to achieve the same in a more controlled way. Readonly is one of those YAGNI things. There’s no illusion of encapsulation in bash, and only one variable namespace. If you make a variable readonly, you can’t even assign to a different local variable of the same name. It’s a mess.

    Point 3 is really only errexit nonsense, but it has 2 things worth pointing out. First, it mentions pipefail. This is a bash-only option and should be explained as such. This will negatively impact the portability of your scripts. And if you didn’t want portability, why are you writing a shell script? Second, it mentions the old [ "$?" -eq 0 ] test. This is a bad idea. Almost every time someone writes this, they actually meant to put the command inside the if statement. The fact that it is taken seriously in this article (and that the only alternative presented is the || operator) suggests that the author is a beginner at bash scripting, which makes me wonder why he is teaching beginner mistakes to people.

    Points 4/5 are covered above.

    Point 6: The road to hell is paved with good intentions. While you quote variables in some places there, you leave them conspicuously absent from others. Why? It’s either a misunderstanding or an oversight, and both go against the point you are trying to make. (Also, it’s probably out of the scope of the article, but echo should not be used to print variables, and variables should not be passed to commands without some kind of safety like the double-dash or prepending a “./”)

    Point 7 — the stuff here is good, moving on.

    Point 8: You mention a problem with sudo but not the solution. Use the double dash argument, it saves lives. sudo -u protuser -- "$@"

  18. Lots of good info here, but the problem I have with this article is that most of it is about how to make bash not do bad things, which kinda disproves the point.
    (with “trap”, doesn’t a signal parameter of 0 catch all exits?)
