Script kiddies!
Chris Katko
Member #1,881
January 2002
Post useful scripts for Windows/Linux that you've written!

I was working on one last night that uses nbtscan (NetBIOS scan, Win/Linux) to scan for all computers on a network and show their workgroup and MAC address. This way I can easily tell how many computers are NOT joined to the domain like they should be. It computes the network range/CIDR from your IP and subnet mask, runs nbtscan, and parses the output. I could also run nmap the same way. I just went back and used "here documents" to embed the supporting awk scripts into the bash script. Pretty fun stuff.

# State restoration
# ==========================================================

OLDIFS=$IFS

# Embedded support programs
# ==========================================================


# awk 1 - get CIDR range from netmask
# ---------------------------------------------
# read -d '' slurps the whole here document into the variable
read -d '' awk_range << 'EOF'
BEGIN {
    FS = "."
}

{} # nothing to do per line; the single input line's fields remain available in END

END {
    cidr_range=0

    for (i=1; i<=NF; i++)
    {
        # print "trying " $i " in " i " of " NF
        if($i=="128"){ cidr_range+=1 }
        if($i=="192"){ cidr_range+=2 }
        if($i=="224"){ cidr_range+=3 }
        if($i=="240"){ cidr_range+=4 }
        if($i=="248"){ cidr_range+=5 }
        if($i=="252"){ cidr_range+=6 }
        if($i=="254"){ cidr_range+=7 }
        if($i=="255"){ cidr_range+=8 }
    }

    print cidr_range
}
EOF


# awk 2 - parse nbtscan output
# ---------------------------------------------
read -d '' awk_nbtscan << 'EOF'
BEGIN { FS = "," }

{
    ip[$1] = $1 # NOTE: we use the IP to access each record

    field[$1] = $2 # leftovers from debugging
    value[$1] = $3

    data[$1]["ip"] = $1

    if($3 == "Workstation Service")
    {
        data[$1]["hostname"] = $2
    }

    if($3 == "Domain Name")
    {
        data[$1]["domainname"] = $2
    }

    if($2 == "MAC")
    {
        data[$1]["mac"] = $3
    }
}

END {
    for(x in ip){
        print data[x]["ip"] " - " data[x]["hostname"] " - " data[x]["domainname"] " - " data[x]["mac"]
    }
}
EOF


# BEGIN PROGRAM
# ============================================================

#echo "255.255.128.0" | awk "$awk_range"
#nbtscan -vh -s, 192.168.1.1-254 | awk "$awk_nbtscan"


# grab the IP and netmask from ifconfig
# -----------------------------------------

if [ $# -eq 0 ]; then
    echo "Using default, hardcoded interface wlan0."
    interface="wlan0"
else
    interface="$1"
fi

echo "Using interface [$interface]"

ip=`ifconfig "$interface" | egrep -oe 'inet addr:([[:digit:]]{1,3}\.){3}[[:digit:]]{1,3}' | awk -F: '{print $2}'`
netmask=`ifconfig "$interface" | egrep -oe 'Mask:([[:digit:]]{1,3}\.){3}[[:digit:]]{1,3}' | awk -F: '{print $2}'`

IFS=. read -r i1 i2 i3 i4 <<< "$ip"
IFS=. read -r m1 m2 m3 m4 <<< "$netmask"
network=`printf "%d.%d.%d.%d\n" "$((i1 & m1))" "$((i2 & m2))" "$((i3 & m3))" "$((i4 & m4))"`

echo "ip=$ip"
echo "netmask=$netmask"
echo "network=$network"

# Return only the network range
# -----------------------------------------

range=`echo "$netmask" | awk "$awk_range"`
cidr="$network/$range"
output=`nbtscan -vh -s, "$cidr" | awk "$awk_nbtscan"`
echo "range=$range"
echo "cidr=$cidr"
echo "$output"

IFS=$OLDIFS
Output is:

Using default, hardcoded interface wlan0.
Using interface [wlan0]
ip=192.168.1.175
netmask=255.255.255.0
network=192.168.1.0
192.168.1.0 Sendto failed: Permission denied
192.168.1.255 Sendto failed: Permission denied
range=24
cidr=192.168.1.0/24
192.168.1.196 - HPD1AA8C - MSHOME - 00:00:00:00:00:00
192.168.1.147 - TITAN - WORKGROUP - 00:af:86:22:11:a7
192.168.1.175 - SATURN - WORKGROUP - 00:00:00:00:00:00

Basically, you just run the script (optionally give it the interface name you want to use; wlan0 is the default) and it'll grab your network information, compute a proper CIDR / network range (without needing another app), and then pipe that to nbtscan. I might add nmap later. Previously, I used ipcalc (another apt-get package) to produce the CIDR, but it was fun replacing it with another awk script.
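For comparison, the netmask-to-CIDR step can also be done in pure bash. A minimal sketch that makes the same assumption the awk does (a well-formed, contiguous mask); the function name is my own:

# count the 1-bits in a dotted-quad netmask, e.g. 255.255.255.0 -> 24
mask_to_cidr() {
    local IFS=. octet bits=0
    for octet in $1; do
        case "$octet" in
            255) ((bits += 8)) ;;
            254) ((bits += 7)) ;;
            252) ((bits += 6)) ;;
            248) ((bits += 5)) ;;
            240) ((bits += 4)) ;;
            224) ((bits += 3)) ;;
            192) ((bits += 2)) ;;
            128) ((bits += 1)) ;;
        esac
    done
    echo "$bits"
}

mask_to_cidr 255.255.255.0   # prints 24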
furinkan
Member #10,271
October 2008
I have a post-install script that adds a basic set of tools to an Ubuntu or Linux Mint system. Warning: this installs PHP 7. https://gist.github.com/derrekbertrand/7668a695911260dee0c8

This one mounts a folder on a remote host using sshfs and pokes it every minute to make sure the connection doesn't break. Honestly, I've stopped having a need for it, as I've changed workflows. https://gist.github.com/derrekbertrand/8911d178ebe23b15670c

I also have one which provisions an account on a server with nginx for the Laravel framework (mostly). Mainly for my internal use.
bamccaig
Member #7,536
July 2006
I need to learn awk someday. That's one tool that I'm lacking. All I know is "{print $1}", where $1 is a positional field in the input. I do know Perl, though, which is probably more powerful, featureful, and clean, so I would generally rely on that, but having awk would be nice when perl isn't available.

When possible, you should prefer [[ over [ because it's smarter (less quoting required). I would generally quote the right-hand side of a variable assignment unless I know it can't possibly contain white-space (e.g., a static value). In the case of a variable or a subcommand, I'd quote it just to be safe. (There's a short example after the function below.) Generally, with UNIX-like scripting languages, quotes nest properly without thinking too hard about it. In Windows, I'm sure you're aware that it's a brainfuck...

You just taught me about here-strings!

I don't particularly have any scripts in mind at this time. Much of my scripting ends up in my rc repo: https://github.com/bambams/rc (i.e., .bash.d.source, .bash.d). Depending on context, I sometimes put them into other repos. For example, git and mercurial scripts have their own repos. Those tend to be very ugly and hacky scripts that go stale if they aren't practical.

Here's an example of a simple script that defines a function which allows me to interact with wifi connection settings from a machine that doesn't have automatic wifi setup. It was written for my EeePC netbook running a basic Debian system that booted into text mode, where I'd start X manually running xmonad as the window manager. With some magic in /etc/sudoers I was allowed to change a symlink in /etc/network/interfaces.d without a password, which is where I'd manually add configuration for networks I knew about and used (generally, only home, and places I'd spend the night). Since I am always working from a command line, I generally wrap repetitive tasks into a smart command such as this so that I can type a lot less and don't have to remember the intricate details.
wifi() {
    local interface=wlan0;
    local dir="/etc/network/interfaces.d";
    local symlink="${dir}/${interface}";

    export WIFI_DEFAULT_NETWORK="${WIFI_DEFAULT_NETWORK:-tiny-bronco}";
    export WIFI_NETWORK_PATH="${WIFI_NETWORK_PATH:-$HOME/.network.wlan0}";

    local current="$(cat "$WIFI_NETWORK_PATH")";

    case "$1" in
        connect|on|up)
            sudo /sbin/ifup "$interface";
            ;;
        disconnect|down|off)
            sudo /sbin/ifdown "$interface";
            ;;
        goto|switch|use)
            local network="${2:-$WIFI_DEFAULT_NETWORK}";
            local path="${dir}/${network}.${interface}";

            sudo ln -fs "$path" "$symlink" || return 1;

            if [[ $network != $current ]]; then
                echo "Changing to $network.";
                echo -n "$network" > "$WIFI_NETWORK_PATH";
                wifi off;
                wifi on;
            fi;
            ;;
        list|ls)
            sudo ls "${dir}/" | sed -e 's/\.wlan0\b//g';
            ;;
        query|which)
            echo "$current";
            ;;
        re|restart|reup)
            wifi off && wifi on;
            ;;
        st|status)
            /sbin/ifconfig "$interface";
            ;;
        *|'?'|help)
            cat <<USAGE | "${PAGER:-less}";
wifi [SUBCOMMAND] [ARG...]

Subcommands:

    connect (alias: on, up)

        Args: none.

        Attempt to connect the wireless interface using
        the selected configuration.

    disconnect (alias: down, off)

        Args: none.

        Attempt to disconnect the wireless interface.

    goto (alias: switch, use)

        Args: configuration file name (excluding the extension).

        Change the selected configuration. Configurations
        must already exist within the configuration
        directory ($dir).

        If the configuration is changed then we
        automatically attempt to disconnect from an
        existing interface and connect using the new
        configuration.

    help (alias: ?) (default)

        Args: none.

        Print this message to the pager (default pager:
        less).

    list (alias: ls)

        Args: none.

        List available configurations.

    query (alias: which)

        Args: none.

        Print the name of the currently selected
        configuration (if known).

    restart (alias: re, reup)

        Args: none.

        Disconnect and connect.

    status (alias: st)

        Args: none.

        Print the ifconfig(1) of the network interface.
        The format is not defined by this command. It is a
        simple shell-out (as are most commands).
USAGE
            ;;
    esac;
};
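To illustrate the [ versus [[ and quoting points above, a contrived sketch (the file name is made up):

# [ is an ordinary command, so an unquoted variable gets word-split first:
file="my file.txt";
[ -f $file ] && echo "found";    # breaks: [ sees too many arguments
# [[ is shell syntax; no word-splitting happens inside it:
[[ -f $file ]] && echo "found";  # works even unquoted
# and quoting assignments from subcommands is cheap insurance:
dir="$(dirname "$file")";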
Gideon Weems
Member #3,925
October 2003
bamccaig said: "I need to learn awk someday. That's one tool that I'm lacking."

Same here. I've always been able to get by with bash, Python, and of course non-awk coreutils... I know there are a few sed/grep lines I've written that would have been more elegant with awk. Maybe someday.

I have no scripts good for sharing, but I would like to share a simple tip: check out the zenity, dialog, and osd_cat packages of your distribution.
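For instance, a yes/no prompt is a one-liner with either tool. A minimal sketch (both packages need to be installed):

# zenity pops a GTK dialog; the exit status carries the answer
if zenity --question --text "Upgrade all packages now?"; then
    sudo apt-get upgrade
fi

# dialog does the same in curses (arguments are text, height, width)
dialog --yesno "Upgrade all packages now?" 7 40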
Chris Katko
Member #1,881
January 2002
I always put off awk till last night. It's actually super easy and has a C-like syntax. You have:

BEGIN {}
{}
END {}

where BEGIN and END are optional header/footer blocks executed once before and after. Otherwise, everything is executed per line. You can set up the field separator (FS) to be whitespace (the default), or commas, or dots, etc.

There are some strange things. $0 is the whole line. $1 is the first field. $2 the second. HOWEVER, variables do NOT have dollar signs like bash. Dollar signs are only for fields. I kept running into problems with that initially.

NF is the number of fields, hence the for loop between 1 and the number of fields ($0 being ignored because it's all fields).

Disclaimer: I've only started learning it, so there may be better ways, or incorrect nomenclature used.
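A tiny, runnable illustration of all of those pieces at once (made-up input):

printf 'alice:30\nbob:25\n' | awk '
BEGIN { FS = ":"; print "name age" }        # runs once, before any input
{ print $1, $2; total += $2 }               # runs once per input line
END { print "lines:", NR, "sum:", total }   # runs once, after all input
'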
bamccaig
Member #7,536
July 2006
Nitpick mode activated. I realize that you didn't necessarily write all of this, and its purpose is limited, so the code doesn't have to be perfect, but we're here, so I figure we might as well point out improvements that can be made.
# Modify the following to match your system
NGINX_CONFIG='/etc/nginx/sites-available'
NGINX_SITES_ENABLED='/etc/nginx/sites-enabled'
PHP_INI_DIR='/etc/php5/fpm/pool.d'
WEB_SERVER_GROUP='www-data'
NGINX_INIT='/etc/init.d/nginx'
PHP_FPM_INIT='/etc/init.d/php5-fpm'
To make it so that you don't have to modify the source itself, you can make those environment variables inherited from the environment, and defaulted if unset. That's how I'd prefer to do it. That way the source doesn't have to change at all, and the actual "config" can be put in a separate file sourced by your shell.

VARNAME="${VARNAME:-default_value}";

If they aren't environment variables then it's probably a better idea to make them lowercase so that they'll be less likely to collide, and they'll stand out differently too.

VARNAME=value; export VARNAME
# vs
varname=value;
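As a concrete sketch of that defaulting pattern with the gist's own variables (the ~/.provisionrc name is made up; any file your shell sources works):

# optional per-user overrides; the script itself never needs editing
[[ -r ~/.provisionrc ]] && source ~/.provisionrc;

# fall back to the defaults when the environment doesn't say otherwise
NGINX_CONFIG="${NGINX_CONFIG:-/etc/nginx/sites-available}";
NGINX_SITES_ENABLED="${NGINX_SITES_ENABLED:-/etc/nginx/sites-enabled}";
PHP_INI_DIR="${PHP_INI_DIR:-/etc/php5/fpm/pool.d}";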
SED=`which sed`
There's basically no value in doing this (unless maybe you're going to modify PATH and want to make sure it doesn't screw with things). An absolute path can be useful for processes running with elevated privileges, but only if the absolute path is known. Querying the system for the absolute path is no different than just letting the system resolve it.
PATTERN="^(([a-zA-Z0-9]|[a-zA-Z0-9][a-zA-Z0-9-]*[a-zA-Z0-9])\.)*([A-Za-z0-9]|[A-Za-z0-9][A-Za-z0-9-]*[A-Za-z0-9])$";
Note that this relies on $ being literal because of what follows. It's probably acceptable, but it made me question it. Using a strategy like Chris did to interpolate parts, especially repeating ones, might make it easier to understand. This is a pretty common way to simplify regular expressions.

wrdchr_re="a-zA-Z0-9";
bndchr_re="[${wrdchr_re}]";
midchr_re="[${wrdchr_re}\-]";
domprt_re="(${bndchr_re}|${bndchr_re}${midchr_re}*${bndchr_re})";
domain_re="^(${domprt_re}\.)*${domprt_re}\$";

I chose somewhat obscure names just to make them short enough to keep the composition in a single line/terminal.

Unless you're targeting various shells with varying capabilities, you can probably reduce this to:

read -p "Would you like use ssl on this site (y/N)?" USESSL

Alternatively, I'd at least use "echo -n" so that the response was on the same line as the prompt.
if [ $USESSL == "y" ]; then
The thing that should be quoted, $USESSL, is not, and the thing that doesn't need to be quoted, the "y", is. Lastly, I'd reiterate quoting of variable expressions and subcommands.

Append: Thanks, Chris! That pretty much takes the mystery out of the first awk program. I'm speculating that what looks like array references in the latter is actually storing data into a dictionary, but I'm not sure? It seems like it would help an awful lot to understand those awk programs if you began them with a comment describing the input format as best as you understand it, or even offered a short sample input for reference. For somebody familiar with the input source it's probably not very necessary, but since I've never used nbtscan I have no idea what to expect.
furinkan
Member #10,271
October 2008
@bams: No, I only wrote like 20% of that. The original was old and did the job, but I wanted a little more functionality out of it, and I wanted to tweak some things to suit my needs. More importantly, with some fidgeting, it will properly set up an SSL cert. So yeah, if I were to rewrite it myself, I'd clean it up substantially. But you can't complain about free shit on the internet.

EDIT: Now that I'm looking closer, the latter half of your post was written by me. Let's just put it out there that bash is not my strong suit.
bamccaig
Member #7,536
July 2006
Quote: "But you can't complain about free shit on the internet."

Disagreed.

furinkan said: "Let's just put it out there that bash is not my strong suit."

Bash (and sh-variants in general) tends to be something that people don't start out caring about, but eventually its usefulness works its way into your life. I gradually developed a somewhat intermediate level of skill hacking in bash. If you do choose to do any hacking in bash, I recommend joining #bash on freenode to ask for criticisms or help. They are extremely helpful, extremely wise, have a ton of readily linkable best practices at their disposal, and they can seriously improve the quality of your bash code and give you a lot more confidence in it.

I also have experience being put onto a system with an ancient sh variant. You don't know how much you'll miss the bells and whistles until they're gone. It was refreshing to see how many features were still supported even then, but it was still a bit of a fight to get comfortable with it.
Chris Katko
Member #1,881
January 2002
Another thing I learned last night:

apt-get install most
export PAGER=most

most lets you use color highlighting in man pages!

[edit] Okay, so the second awk. awk supports all kinds of crazy hash-based arrays. You can give it any value, and it translates it with a hash function to an index. This was actually the first awk program I made with arrays, too.

The BEGIN sets the FS (field separator) to comma. Then, for each line (field and value, I think, are old versions from when I was debugging), data is indexed with IP values that are auto-converted with a hash:

data["192.168.32.1"]["ip"] = "192.168.32.1"

So if nbtscan:

nbtscan -vh 192.168.1.1-254 -s,

outputs:

192.168.1.175,SATURN ,Workstation Service
192.168.1.175,SATURN ,Messenger Service
192.168.1.175,SATURN ,File Server Service
192.168.1.175,__MSBROWSE__,Master Browser
192.168.1.175,WORKGROUP ,Domain Name
192.168.1.175,WORKGROUP ,Master Browser
192.168.1.175,WORKGROUP ,Browser Service Elections
192.168.1.175,MAC,00:00:00:00:00:00
192.168.1.147,TITAN ,Workstation Service
192.168.1.147,WORKGROUP ,Domain Name
192.168.1.147,TITAN ,File Server Service
192.168.1.147,WORKGROUP ,Browser Service Elections
192.168.1.147,MAC,00:19:86:81:11:a7
192.168.1.196,HP9CB6545A1226 ,Workstation Service
192.168.1.196,MSHOME ,Domain Name
192.168.1.196,HP9CB6545A1226 ,File Server Service
192.168.1.196,HPD1AA8C ,Workstation Service
192.168.1.196,HPD1AA8C ,File Server Service
192.168.1.196,MAC,00:00:00:00:00:00
So we then take each IP, and based on the third column, we decide what to do with it. If it's "Workstation Service", it's the computer name. "Domain Name" is the domain name / workgroup. However, if the SECOND column is "MAC", we use the third column for the data. All straightforward stuff, just ripping the data from the nbtscan output. So for each line, we throw the data into a data structure. And then when we're done, we just dump it all out in a tabular format.

And that first awk is just taking a subnet string (delimited by periods) and converting each field into bits, taking advantage of the fact that you can't have any zeros in between ones in a subnet mask (11111000, never 01001101), so we don't really have to convert to binary and can just match the 8 possible cases for each field. [Of course, this is IIRC. If subnet masks CAN have zeros in the middle, this will break apart.]

[edit] I actually wasn't sure whether to comment the script as a kind of tutorial or not. I could add some more if people really want it.
bamccaig
Member #7,536
July 2006
Before you all start changing PAGER to most: http://unix.stackexchange.com/a/81131. The Web site for it is incredibly bare. Most ... of it is "under construction". It's basically undocumented from the looks of things (note: I haven't installed it). In any case, less is pretty featureful already, so I'm not really in need of a better pager. And apparently most does lack a few less features. Still, thanks for spreading the knowledge, Chris. I'll be watching for it next time it comes up to see if the game has changed. In the meantime, as with probably most people, LESS is set to FRSX. If yours isn't, Google why maybe it should be.

Thanks for the explanation of the 2nd awk program. It makes sense. It also helps me to see where inspiration for some Perl features may have come from.

As for subnets, I think it's guaranteed to be 1's and then 0's (never mixed). The subnet mask is supposed to tell you which bits of the address are network address versus host address, and it wouldn't make sense to mix them up.
Chris Katko
Member #1,881
January 2002
I just realized yesterday that awk can actually do much more:

BEGIN {}
pattern {}
pattern {}
...
END {}

where pattern can actually be a regular expression:

/taco/ {}

or a variety of functions:

# check the whole line, $0 (could also have done only $1/$2/etc.);
# a successful match sets RSTART to the index of the first match
match($0, /taco/) { print "found at", RSTART }

Also, NR is the number of records... when it's in END. But in the normal lines? It tells you WHICH line number you're on.

Both of those really let you get up and running with logic for each line very quickly, instead of having to do some more complex, general "string compare" switch statement.
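A quick sketch of pattern blocks and NR together (made-up input; works in gawk/mawk):

printf 'beef taco\nfish taco\nburger\n' | awk '
/taco/  { print "line " NR " mentions tacos: " $0 }  # regex pattern
NR == 1 { print "(that was the first line)" }        # expression pattern
END     { print NR " lines total" }                  # in END, NR is the line count
'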
Erin Maus
Member #7,537
July 2006
Awk can do many things. There's nothing it can't do, or nothing that would surprise me, after seeing someone make a raycaster with it...

But of the good bits of Unix, I think shell scripts are... terrible? They're a great idea, but the implementation is straight out of the 80s at best (and not in a good way). It probably wouldn't be too difficult to make a simple shell that leverages a powerful scripting language for personal use. For example, I somewhat like the idea of PowerShell (programs/actions being objects), but the syntax is even worse than any popular Unix shell... I've thought about it, but I can barely get out of bed, so it's definitely out of my league right now.

(A shell that properly embraces and extends my favorite scripting language, Lua, would be pretty neat...)
Bruce Pascoe
Member #15,931
April 2015
The problem with that is that a shell using a proper scripting language would be a bitch to use manually (i.e. entering commands in the terminal). Imagine using the Python or Node.JS interpreters as an everyday shell! Unix shells essentially need to serve two masters: They need to be Turing complete so you can automate anything with a shell script, but also have a compact syntax so manual commands can be entered quickly. Besides, you can already do what you want by adding a shebang to the top of the script so that the OS will run it using the proper interpreter. For example, you could write a Python script starting with #!/usr/bin/python and run that like you would any other shell script. Since you specifically mention personal use, installing Python (or Lua, or whatever) shouldn't be an issue.
bamccaig
Member #7,536
July 2006
On the contrary, Unix shell languages are timeless. They're more likely to be straight out of the 70s with most features, and the 60s for others, but I digress. This was at a time when they understood well how best to communicate with computers. It's pretty interesting to see just how many things they got right compared with future generations, and how their ideas from those origin days still remain the best ideas today.

PowerShell sounds like a great idea in theory. Communicate with objects? How cool! The idea really breaks down when you realize how complicated that makes everything. The brilliance of the standard stream interface is that any two programs can talk to each other regardless of whether or not they were designed to do so. In fact, they could have been written in completely isolated systems without ever knowing of each other's existence, and later you can interface them together easily, either directly (a | b) or indirectly (a | c | b), if necessary. My understanding of the PowerShell interface is that programs essentially need to be explicitly written with specific object interfaces in and out. While nothing should theoretically prevent you from writing that c program to transform one object into another one, consider the verbosity of statically-typed OOP and how that's supposed to fit onto a command line! In practice, c is usually not some custom program a_to_b, but rather it's awk ..., sed ... or perl ... oneliners hacked right there on the command line in 30 to 60 characters or so. Certainly more lengthy solutions exist, and often they are indeed written to disk as a separate command (or module), but the power exists because the communication interface for every program is a stream of bytes (and that stream could very well represent an object, a file, text, or anything else).

Since I am predominantly a command-line user, I always operate from a command shell when I can, even in Windows. A few years back I made the conscious choice to just bite the bullet and switch from cmd.exe to PowerShell. And I spent a good 6 months exclusively in PowerShell trying to figure out how to use it effectively (note: cmd.exe has always been pretty terrible, so I was very anxious for something better). I still had MinGW/MSYS in my PATH, so I was able to use the UNIX-like tools from the PowerShell console.

What I found was that learning PowerShell commands and invoking them was extremely difficult compared to the UNIX-like tools I already had at my disposal. There was no benefit to learning the PowerShell commands, and in fact they lacked in power compared to what the standard shell interfaces already had. The console program (i.e., the window/emulator/etc.) itself was very disappointing in that it had no features that cmd.exe didn't have. PowerShell cmdlets are not just a man page or Google away from figuring out some command and wiring it up. Instead, it is a painful process of trying to learn and understand an entire custom API for every command, and battling frustratingly with the fact that they just can't do certain things. IIRC, there's no proper concept of STDERR. They had some other custom error thing that worked way differently. It just made for very clumsy, looooooooong commands that barely worked. Things that you'd expect to work don't. And the shell lacked features that UNIX shells have had for nearly 50 years... I doubt that has improved much in the past 5 years. In general, it's a massive failure in my eyes.
It has been a long while since I used it or cared, so it's possible that things, or at least documentation, have changed, but for a taste of what I mean see here: http://stackoverflow.com/questions/4998173/how-do-i-write-to-standard-error-in-powershell. The friendly PowerShell users trying to help just don't grasp the simplicity of what this guy is trying to do. Eventually, they're basically like, "well, you can't do that, ... but this is close...?"

I don't recall now where I got this idea, but I seem to recall reading that Microsoft was never setting out with the knowledge of UNIX-like shells to do a better job of it. As usual, they wanted incompatibility and vendor lock-in. Their motivation was apparently to write a brand new shell to become a standard of sorts, for embedded devices or CMOS systems or who knows what... You'll note before PowerShell came "Windows Script Host", which allowed for "shell scripting" to be done in VBScript or JavaScript, largely dependent on another Redmond atrocity, COM objects. These are terrible interfaces, despite JavaScript actually being a pretty great programming language.

I'd argue that if you find bash somehow deficient in features or clumsy then you just haven't spent enough time with it. This would be a fantastic thread to pose some questions or complaints and see if our collective knowledge can enlighten you.

Append:

Aaron Bolyard said: "Awk can do many things. There's nothing it can't do, or nothing that would surprise me, after seeing someone make a raycaster with it..."

Wow.
Erin Maus
Member #7,537
July 2006
Bruce Pascoe said: "The problem with that is that a shell using a proper scripting language would be a bitch to use manually (i.e. entering commands in the terminal). Imagine using the Python or Node.JS interpreters as an everyday shell! Unix shells essentially need to serve two masters: They need to be Turing complete so you can automate anything with a shell script, but also have a compact syntax so manual commands can be entered quickly."

I figured there would be a "lightweight" input mode similar to current shells that's suitable for most basic commands, like redirecting input and piping and all. Anything a bit more advanced would let you switch into a pseudo-REPL mode with the syntax of the language... I often enough write one-off commands that are incredibly cumbersome to type/parse (but are so specific at that moment that writing a shell script would be pointless) that having a more sane syntax would be nice...

Of course, for most other tasks, a lightweight input mode would be fine, but there's a use case for a slightly more verbose, but more readable, method of entering commands.
Chris Katko
Member #1,881
January 2002
I just started touching PowerShell a few months ago. I really liked it. It was a godsend trying to debug an Exchange server: grab all messages in the queue, filter them by these fields, and send them to the screen in a nice table. It worked great. I did some other stuff too. Nothing needed hardcore conversions like my typical awk program, though. And EVERYTHING was self-documenting, including the data structures.

The thing is, PowerShell can link to almost everything that a GUI can in Windows now. COM, WMI, and .NET all can be called. Microsoft provides a unified interface, and PowerShell easily links up to those interfaces. It also easily returns class objects with methods, and can easily be linked up with .NET sub-programs. To Microsoft's credit, Windows has very clear boundaries for their interfaces, so adding a new interface or scripting language is a simple, entirely encapsulated job.

However, to Bam's issue, scripting tends to involve badly written programs (or programs used outside their intended purpose), and using only serialized data lets you survive that (BUT NOT ALWAYS) and keep going. However, we're also missing out, because we HAVE to constantly write scripts that take that serialized data, unserialize it to work on it using REGEX, and then re-serialize it for the next program. At BEST we're running either each-line-is-a-field, or CSV where each row is a UNIT of fields separated by commas. But what happens when a program spits out data every TWO lines? Or worse, sometimes two, and if there's more data available, THREE or more? What happens when we're dumping multi-line text with special characters? (My living nightmare happens.)

I JUST had to write an awk program to parse nmap's output for host discovery. It dumps the host name, it dumps the type of "I'm here!" acknowledgement found (ARP vs TCP, etc), and IF it finds a MAC address it dumps that too. All of that crap is in between lines of description text, and thanks to the MAC, the number of lines per "object" is variable! They actually say DON'T parse their text output in their man page. You should actually be using... and here it is... their XML output option. OH WAIT. How the hell would you get BASH, a language without methods and member variables (AFAIK), to parse XML?

So not only do you have this huge productivity hit from the impedance mismatch, but any data you don't intentionally capture is lost. And since shell programs can output differing data, you might not even SEE some of the edge cases when you're building your data capture script.

To me--and I'd actually be willing to put work into doing this--the ideal Bash world would have the main syntax unchanged for compatibility reasons, but an OO extension that directly supported XML, JSON, and classes with built-in serializers/deserializers as constructors or something. (So you can write your own, but they're clearly marked as entry and exit points in the class structure.) I see no reason why most UNIX commands shouldn't have a -JSON option. That format is not going away, and it's way more legible and space-efficient than XML. Although MOST programs don't even have an XML format.

Also, I think all programs should have a SCHEMA section of the man page, but PowerShell is so amazing you can just store the output into a $object and then just start poking around and reading the object's fields to learn the output. To me, PowerShell is much more "new user" friendly. You don't need a manpage for a lot of things the way Bash scripting does, because the OO nature lets you glean a lot more information from a proper, rigid, known architecture. "Oh. Those are the methods, those are the variables, and they're all clearly named." as opposed to "Oh god, what is this gibberish text screen supposed to mean?"

Every one of us has seen a 300-column CSV where you constantly have to look back at the header line to see what field it is, until you eventually give up and load a GUI CSV parser. (But when Excel/OpenOffice crash on very large CSVs, then you're stuck with strange, crappy, shareware CSV readers. That happened; it sucked.)

Lastly, one thing that surprises me is how many people consider C++ a good language, and OO proper modern methodology, but would never consider using it in Bash/command-line scripting. Now, I'm not saying it should ever be REQUIRED, but as an option, I can't think of any good reason to oppose it.
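For what it's worth, pieces of this already exist as separate commands, which is arguably the bash answer: shell out to a real parser. A rough sketch, assuming xmlstarlet and jq are installed:

# nmap's sanctioned XML output, picked apart with XPath instead of regex
nmap -sn -oX scan.xml 192.168.1.0/24
xmlstarlet sel -t -m '//host' -v 'address/@addr' -n scan.xml

# and jq gives JSON the same field-plucking treatment awk gives columns
echo '{"host":"TITAN","mac":"00:af:86:22:11:a7"}' | jq -r '"\(.host) - \(.mac)"'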
Erin Maus
Member #7,537
July 2006
Tangentially related, but when I started using FreeBSD as my primary operating system, I only had passing experience with shells (mostly Msys on Windows, funnily enough). So I found zsh to be greatly superior to bash without any prior biases... It also helps that zsh has incredible auto-magic-complete support, unlike bash, which helps a lot when working with certain programs (such as tar).
Chris Katko
Member #1,881
January 2002
zsh is pretty magical. I haven't been able to get into it yet except for an hour or two. I can't decide whether I like it or not, though.

ALSO, I forgot to mention: how do you parse a program's output when it's written so poorly that it dumps error messages to STDOUT instead of STDERR? (The horror... the horror...)
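When a program commits that sin, the shell can at least paper over it (badprog and the ERROR: prefix are hypothetical):

# errors arriving on stdout: filter them back out of the data stream
badprog | grep -v '^ERROR:' > data.txt

# the reverse sin (data on stderr) is easy to fold back in
otherprog 2>&1 | sort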
Michael Faerber
Member #4,800
July 2004
Chris Katko said: "To me--and I'd actually be willing to put work into doing this--the ideal Bash world would have the main syntax unchanged for compatibility reasons, but an OO extension that directly supported XML, JSON, and classes with built-in serializers/deserializers as constructors or something. (So you can write your own, but they're clearly marked as entry and exit points in the class structure.)"

That made me think of Haskell. I found it to be very nice for scripting tasks harder than Bash one-liners. And it supports JSON serialization via the "aeson" library. (However, the syntax is quite different from Bash. But after some time, I've come to appreciate that. ^^)

Yesterday, I wrote a Haskell script that reads an HTML table with years in some lines of the first column, and groups all lines in the second column that belong to a specific year. Then, it converts every year group into an <h2> and every table line into an enumeration element. Oh, and it also reverses the order of years. And it reformats the resulting HTML code to be nicely indented.

import Data.Maybe
import Text.HTML.TagSoup
import Text.HTML.TagSoup.Tree

main =
  (renderTags . withTree transform . parseTags) <$> readFile "chronik.html"
  >>= putStrLn

emptyTag (TagText t) = all (\ x -> x `elem` " \n\t") t
emptyTag _ = False

pruneTags = filter (not . emptyTag)

formatTags delta = go 0 where
  go off (t : ts) = let (pre, post) = d off t in
    TagText (replicate pre ' ') : t : TagText "\n" : go post ts
  go off [] = []

  d off (TagOpen _ _) = (off, off + delta)
  d off (TagClose _ ) = (off - delta, off - delta)
  d off _ = (off, off)

withTree f = formatTags 2 . flattenTree . f . tagTree . pruneTags

transform =
  transformTree makeHeaders .
  transformTree splitYears .
  transformTree flattenTd

breakIter f (x:xs) = let (pre, post) = break f xs in (x : pre) : breakIter f post
breakIter f [] = []


getTrYear (TagBranch "tr" _ (TagBranch "td" _ [TagLeaf (TagText s)] : _)) =
  if length s == 4 then Just s else Nothing
getTrYear _ = Nothing

makeHeaders (TagBranch "yr" _ trs) =
  [ TagBranch "h2" [] [TagLeaf $ TagText $ fromJust $ getTrYear $ head trs]
  , TagBranch "ul" [] $
      map (\ (TagBranch "tr" _ [_, TagBranch "td" _ td]) -> TagBranch "li" [] td) trs
  ]
makeHeaders x = [x]

splitYears (TagBranch "tbody" atts inner) =
  map (TagBranch "yr" []) $ reverse $ breakIter (isJust . getTrYear) inner
splitYears x = [x]

flattenTd (TagBranch "td" _ [TagBranch "p" _ s]) = [TagBranch "td" [] s]
flattenTd x = [x]
bamccaig
Member #7,536
July 2006
Chris Katko said: "I JUST had to write an awk program to parse nmap's output for host discovery. [...] They actually say DON'T parse their text output in their man page. You should actually be using... and here it is... their XML output option. OH WAIT. How the hell would you get BASH, a language without methods and member variables (AFAIK), to parse XML?"

Nmap::Parser! Tada! Perl has you covered (untested).

Seriously, considering the nature of the work that you're constantly doing, you should give Perl a close look. There's tons of modules for solving these kinds of problems. The motto in Perl is to not shell out (as you might in bash), but rather to prefer CPAN modules which already do the shelling out for you and have wrapped the command in a safety layer. Not only is there, in this particular case, a module that already knows the schema for nmap, but there are also modules for parsing XML or JSON with varying degrees of complexity or simplicity as required for cases where no module exists. For simple cases you could do a one-liner, and for more complex cases write an actual program. Perl isn't the only such platform that will have modules like this, but it does have a long history of being a sysadmin's best friend, so there will likely already be many modules for that space. And where there isn't, you can create your own!

Chris Katko said: "To me--and I'd actually be willing to put work into doing this--the ideal Bash world would have the main syntax unchanged for compatibility reasons, but an OO extension that directly supported XML, JSON, and classes with built-in serializers/deserializers as constructors or something. (So you can write your own, but they're clearly marked as entry and exit points in the class structure.)"

I think you're asking too much of bash. Its primary purpose is as a shell language: invoking commands and wiring them up to the user, the file system, and other commands. It's not meant to do everything the best. There are existing tools that will always do that better than bash ever could. And really, by the very nature of a command shell, you can already "extend" the bash shell by just writing new commands that do what you want. For example, write a command that allows you to simply parse and process XML data.

Chris Katko said: "I see no reason why most UNIX commands shouldn't have a -JSON option. That format is not going away, and it's way more legible and space-efficient than XML."

And maybe one that lets you translate an XML structure into JSON, and a separate one that allows you to easily extract data from JSON. Google first; these probably exist.
Gideon Weems
Member #3,925
October 2003
Aaron Bolyard said: "So I found zsh to be greatly superior to bash without any prior biases."

This is the consensus. Even most people in #bash agree that zsh and fish are better designed. Nobody in his right mind, however, would argue that either is more ubiquitous--and that is where bash truly shines.
bamccaig
Member #7,536
July 2006
Well, it appears to work. Here's a simple test script that just dumps the raw blessed objects of online "hosts". To keep things simple, it separates nmap command-line options from hosts (e.g., IP addresses) using the standard -- option.

#!/usr/bin/env perl

use v5.022;
use strict;
use utf8;
use warnings;

use Data::Dumper;
use Encode;
use List::MoreUtils qw/firstidx/;
use Nmap::Parser;

$Data::Dumper::Sortkeys = 1;
$Data::Dumper::Indent = 1;

# find the first '--' separator (if any) in the argument list
my $sepidx = firstidx { $_ eq '--' } @ARGV;
my @args = $sepidx == -1 ? @ARGV : @ARGV[0 .. $sepidx-1];
my @ips = $sepidx == -1 ? () : @ARGV[$sepidx+1 .. $#ARGV];
my $args = join(' ', @args);

my $np = Nmap::Parser->new();
$np->parsescan('/usr/bin/nmap', $args, @ips);

my @hosts = $np->all_hosts('up');

for my $host (@hosts) {
    print Dumper $host;
}

1;
bambams@sephiroth:~$ perl nmap.pl -- castopulence.org

$VAR1 = bless( {
  'addrs' => { 'ipv4' => '64.85.162.126' },
  'distance' => undef,
  'hostnames' => [ 'castopulence.org', 'b03s17le.corenetworks.net' ],
  'hostscript' => undef,
  'ipidsequence' => undef,
  'os' => undef,
  'ports' => {
    'extraports' => { 'count' => '994', 'state' => 'closed' },
    'tcp' => {
      '113' => { 'service' => { 'confidence' => '3', 'extrainfo' => undef, 'fingerprint' => undef, 'method' => 'table', 'name' => 'ident', 'port' => '113', 'product' => undef, 'proto' => 'unknown', 'rpcnum' => undef, 'script' => undef, 'tunnel' => undef, 'version' => undef }, 'state' => 'open' },
      '22' => { 'service' => { 'confidence' => '3', 'extrainfo' => undef, 'fingerprint' => undef, 'method' => 'table', 'name' => 'ssh', 'port' => '22', 'product' => undef, 'proto' => 'unknown', 'rpcnum' => undef, 'script' => undef, 'tunnel' => undef, 'version' => undef }, 'state' => 'open' },
      '443' => { 'service' => { 'confidence' => '3', 'extrainfo' => undef, 'fingerprint' => undef, 'method' => 'table', 'name' => 'https', 'port' => '443', 'product' => undef, 'proto' => 'unknown', 'rpcnum' => undef, 'script' => undef, 'tunnel' => undef, 'version' => undef }, 'state' => 'open' },
      '554' => { 'service' => { 'confidence' => '3', 'extrainfo' => undef, 'fingerprint' => undef, 'method' => 'table', 'name' => 'rtsp', 'port' => '554', 'product' => undef, 'proto' => 'unknown', 'rpcnum' => undef, 'script' => undef, 'tunnel' => undef, 'version' => undef }, 'state' => 'filtered' },
      '80' => { 'service' => { 'confidence' => '3', 'extrainfo' => undef, 'fingerprint' => undef, 'method' => 'table', 'name' => 'http', 'port' => '80', 'product' => undef, 'proto' => 'unknown', 'rpcnum' => undef, 'script' => undef, 'tunnel' => undef, 'version' => undef }, 'state' => 'open' },
      '9000' => { 'service' => { 'confidence' => '3', 'extrainfo' => undef, 'fingerprint' => undef, 'method' => 'table', 'name' => 'cslistener', 'port' => '9000', 'product' => undef, 'proto' => 'unknown', 'rpcnum' => undef, 'script' => undef, 'tunnel' => undef, 'version' => undef }, 'state' => 'open' }
    },
    'tcp_port_count' => 6,
    'udp_port_count' => 0
  },
  'status' => 'up',
  'tcpsequence' => undef,
  'tcptssequence' => undef,
  'trace' => { 'hops' => [] },
  'trace_error' => undef,
  'uptime' => undef
}, 'Nmap::Parser::Host' );

Requires the List::MoreUtils and Nmap::Parser modules. It also requires Perl 5.22.xx or better, but that can be lifted by removing the appropriate use line. It will fall back on whatever the used modules require. Nevertheless, to install the modules, my advice is to install perlbrew:

curl -L http://install.perlbrew.pl | bash

Follow the directions to source the environment (and add it to bashrc for later). Then install the latest stable perl (this takes a few minutes):

perlbrew install --as stable stable

When it's finished, assuming it went gracefully, install cpanm, switch to the new local perl environment, and install the dependencies.

perlbrew install-cpanm
perlbrew use stable
cpanm List::MoreUtils Nmap::Parser
You can "perlbrew switch" instead of "perlbrew use" if your account doesn't depend on running administrative tasks that rely on a particular environment. To be safe, I opted for "use" so you wouldn't post at 2 AM screaming that servers are down and I broke your system.
Chris Katko
Member #1,881
January 2002
Is perl worth learning? I've heard much about perl being a "write only" language with a very cryptic syntax. I'm sure you don't HAVE to write it cryptically, but it's more like "all code you'll encounter is written as such." On the other hand, their regex language seems very nice, and I end up using it with grep/etc all the time.
bamccaig
Member #7,536
July 2006
The main thing that leads to Perl appearing "cryptic" is its use of sigils (grammatic symbols on variables/expressions) to express context. Unlike most other programming languages, Perl has a concept of plural expressed in the language itself, which can change the meaning of an expression; the creator is a linguist. That, combined with a few other interesting syntax variations, requires learning Perl to understand (i.e., unlike something like Python, you probably wouldn't be able to read it without learning the language first). For example, the following sets of two statements are equivalent:

my @results1 = map { $_ + 2 } grep { /^[0-9]+$/ } @input;
my @results2 = map $_ + 2, grep /^[0-9]+$/, @input;

The first "parameter" to map and grep and family is a code block, and it is magical in the sense that you can either pass a block (in braces) or an expression (note: no comma versus comma). You can think of both essentially turning into lambdas (anonymous functions) implicitly. So that one special expression is not evaluated and then the result passed into map or grep or friends, but instead is passed in as a chunk of code to be repeatedly called against the members of the list passed in.

In Perl, a regular expression is implicitly applied to the default variable, $_. The grep code is equivalent to saying "the current member matches this regular expression", i.e., is a positive integer.

It can also be weird to a newcomer how the use of parentheses is optional in a function call, as above. You could make the parentheses explicit if you thought it reads better (but you'd be wrong):

my @results1 = map({ $_ + 2 } grep({ /^[0-9]+$/ } @input));
my @results2 = map($_ + 2, grep(/^[0-9]+$/, @input));

Note: still no commas between "arguments" in the block case. The code could also be stored in subroutine references (which are like lambdas) and invoked from the blocks. For example:

my $pos_int = sub { /^[0-9]+$/ };
my $plus_2 = sub { $_ + 2 };
my @results = map { $plus_2->() } grep { $pos_int->() } @input;

Whereas Python's motto is that there should only be one way to do it, as you can see, in Perl there's many ways to do it[1]. This obviously can also lead to cryptic code, since the styles can vary wildly and the reader obviously needs to understand all of this stuff for the code to make sense. That said, it gives a lot of expressive power to the author.

Perl is a very complex language with lots of exceptional syntax. It can sometimes be difficult to be sure that what you've tried to say is what you said. This is compounded by the fact that Perl was written as a replacement for things like awk, and as such its default running mode is very relaxed. If variables don't exist they are implicitly ignored/created. If you don't quote a bareword where Perl thinks you meant a string, it will magically become a string. This allows for very short, compact programs in oneliners on the command line, but also makes for difficult-to-debug programs. You generally need to enable the strict and warnings pragmas to enforce "strict" syntax (so none of those things I just mentioned work) and to output warnings when you do something that is considered dangerous or obscure (which will print out a warning message and line number at run-time to warn you that you should change the code to be less reliant on such features).

If you're willing to learn awk then I'd argue that you should learn Perl as well. Perl is like a super-awk/grep/sed/etc. all in one. There have been many additional features over time that improve the language a lot.

Perl 6 has also just been released, which is a completely different language to Perl 5 (what people normally mean when they say Perl). Perl 6 is radically different, and has some really nice features, but it's too young to really know where it stands in practice. It will be years before Perl 5 begins to fade away, so it's probably best to learn Perl 5 first and then, if you feel inspired, move on to Perl 6.

References

1. TMTOWTDI: "There's more than one way to do it" (the Perl motto).
Niunio
Member #1,975
March 2002
Not sure if it is "useful", but it is the one that I use most often.

#!/bin/bash
sudo aptitude update
sudo aptitude upgrade
sudo aptitude autoclean
sudo apt-get autoremove
I know, the "Updater" should do that...