Common shell script commands (III)
2022-06-27 19:53:00 【iNBC】
Common shell script commands (3)
cut: splitting a file by column
The cut command splits files by column rather than by line. It can be used to process files with fixed-width fields, CSV files, or files delimited by spaces (for example, standard log files).
- -f specifies the fields to extract.
- -d sets the field delimiter.
- --complement displays all fields except those specified with -f.
- --output-delimiter specifies the delimiter used in the output.
Examples of the options above:
$ cat output.txt
123 456 789 999 456 789
123 456 789 999 456 789
123 456 789 999 456 789
$ cut -d ' ' -f2 output.txt
456
456
456
$ cut -d ' ' -f2,3 output.txt     # columns 2 and 3
456 789
456 789
456 789
$ cut -d ' ' -f1-3 output.txt     # columns 1 through 3
123 456 789
123 456 789
123 456 789
$ cut -d ' ' -f3- output.txt      # column 3 onward
789 999 456 789
789 999 456 789
789 999 456 789
$ cut -d ' ' -f-3 output.txt      # columns up to and including 3
123 456 789
123 456 789
123 456 789
$ cut -d ' ' -f-3 --output-delimiter "," output.txt   # columns up to 3, output separated by ","
123,456,789
123,456,789
123,456,789
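The --complement option from the list above is not shown in these examples; a minimal sketch using the same output.txt (assuming GNU coreutils cut) looks like this:
$ cut -d ' ' -f2 --complement output.txt   # every column except column 2
123 789 999 456 789
123 789 999 456 789
123 789 999 456 789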
awk: advanced text processing
The structure of an awk script is as follows:
awk 'BEGIN{ print "start" } pattern { commands } END{ print "end" }' file
The following command prints the number of lines in the file:
awk 'BEGIN { i=0 } { i++ } END { print i}' filename
How it works
The awk command works as follows:
- (1) The BEGIN { commands } block is executed first.
- (2) A line is then read from the file or from stdin; if it matches pattern, the commands block that follows is executed. This step repeats until the whole input has been read.
- (3) When the end of the input stream is reached, the END { commands } block is executed.
In more detail:
- The BEGIN block is executed before awk starts reading lines from the input stream. It is optional; statements such as variable initialization or printing a table header are usually placed in the BEGIN block.
- The END block is similar to the BEGIN block. It is executed after awk has read all the lines in the input stream. Tasks such as printing the results of analyzing all the lines are typically done in the END block.
- The most important part is the block associated with pattern. This block is also optional; if it is omitted, { print } is used by default, which prints every line read. awk executes this block once for every line it reads, much like a while loop over the input lines with the corresponding statements in the loop body.
- For every line read, awk checks whether the line matches the specified pattern. The pattern itself can be a regular expression, a conditional expression, or a range of lines. If the current line matches, the statements inside { } are executed.
The pattern is optional. If no pattern is provided, awk treats every line as a match.
When print is used without arguments, it prints the current line, as shown below:
$ echo -e "line1\nline2" | awk 'BEGIN { print "Start" } { print } END { print "End" } '
Start
line1
line2
End
echo writes a single line to standard output, so the statements in awk's { } block are executed only once. If the input to awk contains multiple lines, the statements in the { } block are executed once for each line, as the following example shows:
$ echo -e "123 " | awk '{ var1="v1"; var2="v2"; var3="v3"; print var1 "-" var2 "-" var3 ; }'
v1-v2-v3
$ echo -e "123\n456\n789 " | awk '{ var1="v1"; var2="v2"; var3="v3"; print var1 "-" var2 "-" var3 ; }'
v1-v2-v3
v1-v2-v3
v1-v2-v3
Special variables
Here are some special variables that can be used in awk.
- NR: the record number; when awk uses each line as a record, this is the current line number.
- NF: the number of fields in the current record. The default field delimiter is a space.
- $0: the text of the current record (the whole line).
- $1: the text of the first field.
- $2: the text of the second field.
An example:
$ echo -e "line1 f2 f3\nline2 f4 f5\nline3 f6 f7\nline4 f8 f9 f10" | awk '{ print "Line no:"NR",No of fields:"NF, "$0="$0, "$1="$1,"$2="$2,"$3="$3 }'
Line no:1,No of fields:3 $0=line1 f2 f3 $1=line1 $2=f2 $3=f3
Line no:2,No of fields:3 $0=line2 f4 f5 $1=line2 $2=f4 $3=f5
Line no:3,No of fields:3 $0=line3 f6 f7 $1=line3 $2=f6 $3=f7
Line no:4,No of fields:4 $0=line4 f8 f9 f10 $1=line4 $2=f8 $3=f9
The following command prints the third and second fields of each line:
$ awk '{ print $3, $2 }' file
We can count the number of lines in a file with NR:
$ awk 'END{ print NR }' file
The values of the first field of each line can be summed as follows:
$ seq 5 | awk 'BEGIN { sum=0; print "Summation:" }{ print $1"+"; sum+=$1 } END { print "=="; print sum }'
Summation:
1+
2+
3+
4+
5+
==
15
Passing external values to awk
With the -v option, we can pass an external value (one that does not come from stdin) to awk.
Example:
$ var=100000
$ echo | awk -v variable=$var '{ print variable }'
100000
There is another flexible way to pass multiple external variables to awk. For example:
$ var1="Variable1" ; var2="Variable2"
$ echo | awk '{ print v1,v2 }' v1=$var1 v2=$var2
Variable1 Variable2
Reading lines with getline
getline lets awk read a line explicitly, for example to read the first line ahead of the normal processing loop.
Example:
$ seq 5 | awk 'BEGIN { getline; print "Read ahead first line", $0 }{ print $0 }'
Read ahead first line 1
2
3
4
5
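getline can also read the output of an external command from within awk, which is the basis of the grep example later in this section. A minimal sketch (date is used here only as a stand-in for any command whose output you want to capture, and cmdout is an arbitrary variable name); it prints "Command output:" followed by the first line printed by date:
$ echo | awk '{ "date" | getline cmdout; print "Command output:", cmdout }'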
Using filter patterns to filter the lines processed by awk
$ awk 'NR < 5'          # lines whose line number is less than 5
$ awk 'NR==1,NR==4'     # lines 1 through 4
$ awk '/linux/'         # lines containing the pattern linux (patterns can be given as regular expressions)
$ awk '!/linux/'        # lines that do not contain the pattern linux
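As a quick, self-contained check of the first filter above, piping the output of seq through a line-number pattern:
$ seq 10 | awk 'NR < 5'
1
2
3
4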
Setting the field delimiter
The default field delimiter is a space. A different delimiter can be specified with the -F option:
$ awk -F: '{ print $NF }' /etc/passwd
or
awk 'BEGIN { FS=":" } { print $NF }' /etc/passwd
The field delimiter can also be set by assigning to FS inside the BEGIN block.
Example:
$ cat output.txt
123:456:789:999:456:789
123:456:789:999:456:789
123:456:789:999:456:789
var1="123"
var2="789"
$ awk 'BEGIN { FS=":" } { "grep 123 ./output.txt" | getline; print $1,$6 }' output.txt
123 789
123 789
123 789
var1="123"
awk built-in string functions
- length(string): returns the length of string.
- index(string, search_string): returns the position of search_string within string.
- split(string, array, delimiter): splits string using delimiter as the separator and stores the resulting pieces in array.
- substr(string, start-position, length): returns the substring of string that starts at start-position and is length characters long.
- sub(regex, replacement_str, string): replaces the first match of regex in string with replacement_str.
- gsub(regex, replacement_str, string): like sub(), but replaces every match of regex.
- match(regex, string): checks whether regex can be matched in string; returns a non-zero value if a match is found and 0 otherwise. match() sets two related special variables, RSTART and RLENGTH: RSTART holds the starting position of the match and RLENGTH holds its length.
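A small sketch exercising a few of these functions (the sample string foo:bar:baz and the array name parts are made up for illustration):
$ echo "foo:bar:baz" | awk '{
    print length($0)             # 11, length of the whole line
    n = split($0, parts, ":")    # parts[1]="foo", parts[2]="bar", parts[3]="baz"
    print n, parts[2]            # 3 bar
    if (match($0, /bar/))        # a successful match sets RSTART and RLENGTH
        print RSTART, RLENGTH    # 5 3
}'
11
3 bar
5 3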
Loops
Example 1: a running counter displayed in the terminal:
tput sc                 # save the cursor position before the loop
for count in `seq 0 40`
do
    tput rc             # restore the saved cursor position
    tput ed             # clear from the cursor to the end of the screen
    echo -n $count
    sleep 1
done
List oriented for loop
for var in list;
do
commands; # Using variables $var
done
Here list can be a string or a sequence of values.
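A concrete instance of this form (the fruit names are just sample data):
for fruit in apple orange banana
do
    echo "fruit: $fruit"   # prints fruit: apple, fruit: orange, fruit: banana in turn
done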
Iterating over a specified range of numbers (for)
for((i=0;i<10;i++))
{
commands; # Using variables $i
}
Looping while a condition is true (while)
while condition
do
commands;
done
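A minimal concrete while loop that counts from 0 to 4 (the variable name x is arbitrary):
x=0
while [ $x -lt 5 ]
do
    echo $x
    let x++
done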
until loop
stay Bash
You can also use a special loop in until
. It's going to cycle all the time , Until the given condition is true . for example :
x=0;
until [ $x -eq 9 ];  # the loop condition is [ $x -eq 9 ]
do
let x++; echo $x;
done
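Running this loop prints the numbers 1 through 9, one per line, and stops once x reaches 9.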