Topic Summary
Posted by: Dakusan
« on: October 14, 2017, 12:20:52 am »
A function to replace variables in a file that are in the format "VARIABLE_NAME=VARIABLE_DATA". Parameters are: VARIABLE_NAME VARIABLE_DATA FILE_NAME
function ReplaceVar() {
	REPLACE_VAR_NAME="$1"
	REPLACE_VAR_VAL=$(echo "$2" | perl -e '$V=<STDIN>; chomp($V); print quotemeta($V)' -)
	perl -pi -e "s/(?<=$REPLACE_VAR_NAME[ \t]*=).*$/$REPLACE_VAR_VAL/" "$3"
}

The real difference between this script and a normal command-line Perl regex replace is that it makes sure values are properly escaped for the search+replace regular expression.
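A usage sketch in a throwaway file. The function body is restated with the substitution anchored at line start and a capture group instead of the lookbehind, since variable-length lookbehind is rejected by Perl versions before 5.30. The config contents and the PORT value are invented for the demonstration.

```shell
# Variant of ReplaceVar using a capture group instead of a lookbehind.
ReplaceVar() {
  RV_NAME="$1"
  # quotemeta backslash-escapes every regex metacharacter in the new value
  RV_VAL=$(printf '%s' "$2" | perl -e '$V=<STDIN>; chomp($V); print quotemeta($V)')
  perl -pi -e "s/^($RV_NAME[ \t]*=).*$/\${1}$RV_VAL/" "$3"
}

CONF=$(mktemp)
printf 'HOST=localhost\nPORT=80\n' > "$CONF"
ReplaceVar PORT '8080/path?x=1' "$CONF"  # a value full of regex metacharacters
NEW=$(cat "$CONF")
echo "$NEW"
rm -f "$CONF"
```

Because the value is run through quotemeta, the slashes, question mark, and equals sign pass through the s/// safely.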
Posted by: Dakusan
« on: May 30, 2017, 08:18:38 pm »
To log all process spawns from a user: (Fill in USERNAME)
auditctl -a exit,always -S execve -F uid=USERNAME

To grep for only these entries, and exclude certain processes: (Fill in USERID) (EXCLUDE_REGEX=a regular expression of process names to exclude. Ex: cron|dovecot)

ausearch -m ALL | perl -0777 -e 'print grep(/uid=USERID/, grep(!/EXCLUDE_REGEX/im, split(/^----$/m, <>)))'

Using user searches (-ua -ue -ui -ul) for ausearch may work too, but I've found it unreliable.
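Since auditctl and ausearch need root and a running audit daemon, the split-and-filter trick itself can be sanity-checked on synthetic text. The record contents below are invented stand-ins (real ausearch records are far more verbose); the perl pipeline is the one from above with USERID filled in as 1000 and EXCLUDE_REGEX as cron.

```shell
# Split on the "----" separators, drop excluded processes, keep one uid.
RECORDS=$(printf 'type=SYSCALL uid=1000 comm=cron\n----\ntype=SYSCALL uid=1000 comm=bash\n----\ntype=SYSCALL uid=0 comm=sshd\n' | \
  perl -0777 -e 'print grep(/uid=1000/, grep(!/cron/im, split(/^----$/m, <>)))')
echo "$RECORDS"
```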
Posted by: Dakusan
« on: February 16, 2017, 03:56:28 am »
Had a need today to sync file modification timestamps between plex and the actual files. Code is as follows.
#Dump from sqlite (DB should be in C:\Users\USERNAME\AppData\Local\Plex Media Server\Plug-in Support\Databases)
sqlite3 com.plexapp.plugins.library.db 'select file, updated_at from media_parts' | \
sort | uniq | perl -pe 's/\\/\//g' | perl -pe 's/^\w:/\./gi' | perl -pe 's/\|/\t/g' | \
grep -viP '^((DRIVES_TO_IGNORE):|\./(DIRECTORIES_TO_IGNORE)|\|$)' > NAME_CHECKS

#Find all files with their modification timestamps
find -L -type f -printf "%p\t%TY-%Tm-%Td %TH:%TM:%.2TS\n" | sort > FTIMES2

#Filter out unwanted folders and file extensions. I did this as a separate command from the above line to allow filtering without having to run a find on the entire drive again
cat FTIMES2 | grep -vP '\.(md5|torrent|sub|idx|nfo|srt|txt|ssa|log|db|jpg|tbn|sfv|png|cbz|rar|cbr|OTHER_EXTENSIONS)\t' | grep -vP '^./(System Volume Information|\$RECYCLE\.BIN|OTHER_FOLDERS)/' > FTIMES

#After comparing the 2 files and extracting any files that need to be updated, run this regular expression on the data to get touch commands to update the timestamps
^(.*)\t\d\d(\d\d)-(\d\d)-(\d\d) (\d\d):(\d\d):(\d\d)$  =>  touch -m -t $2$3$4$5$6.$7 "$1"
Posted by: Dakusan
« on: January 20, 2017, 09:05:24 am »
List joined hard links. Same links are separated by tab with a newline between each set.
find -links +1 | xargs -d"\n" ls -i | perl -e 'foreach $Line (<STDIN>) { @Cols=($Line=~/^\s*(\d+)\s*(.*?)\s*$/); push(@{$Link{$Cols[0]}}, $Cols[1]); } foreach $List (values %Link) { print join("\t", @{$List})."\n"; }'
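A quick way to check the one-liner is to fabricate a hard-link pair in a scratch directory (the file names here are arbitrary; -type f is added so the directories' own multiple links don't show up):

```shell
D=$(mktemp -d); cd "$D"
echo data > a; ln a b; echo other > c   # a and b share an inode; c stands alone
LINKED=$(find . -type f -links +1 | xargs -d"\n" ls -i | \
  perl -e 'foreach $Line (<STDIN>) { @Cols=($Line=~/^\s*(\d+)\s*(.*?)\s*$/); push(@{$Link{$Cols[0]}}, $Cols[1]); } foreach $List (values %Link) { print join("\t", @{$List})."\n"; }')
echo "$LINKED"
cd - >/dev/null; rm -rf "$D"
```

Only one set should come back, containing a and b but not c.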
Posted by: Dakusan
« on: June 07, 2016, 10:31:45 am »
Find all directories that do not contain subdirectories
find -type d -exec sh -c 'test `find "{}/" -mindepth 1 -type d | wc -l` -eq 0' ';' -print
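A variant of the same test that passes the directory to the inner shell as a positional argument, which is slightly safer with odd file names than embedding {} inside the script string. The tree below is invented just to show which directories qualify:

```shell
D=$(mktemp -d)
mkdir -p "$D/a/b" "$D/c"   # a/b and c are leaves; a is not
LEAVES=$(cd "$D" && find . -type d -exec sh -c 'test "$(find "$1" -mindepth 1 -type d | wc -l)" -eq 0' _ {} \; -print)
echo "$LEAVES"
rm -rf "$D"
```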
Posted by: Dakusan
« on: January 30, 2016, 01:16:39 am »
Dumping a plex database. This includes:
- Episode path
- Episode name
- Episode number
- Hints (string containing season and episode numbers, and some other info)
AppDirectory='/cygdrive/c/Users/Administrator/AppData/Local/Plex Media Server/'
sqlite3 "$AppDirectory/Plug-in Support/Databases/com.plexapp.plugins.library.db" \
'SELECT file, title, MDI."index", hints FROM media_parts AS MP INNER JOIN media_items AS MI ON MI.id=MP.media_item_id INNER JOIN metadata_items AS MDI ON MDI.id=MI.metadata_item_id;'
Posted by: Dakusan
« on: January 25, 2016, 10:09:29 pm »
Clear the buffer of a terminal in bash
echo -e '\0033\0143'
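For reference, '\0033' and '\0143' are just octal escapes for ESC and the letter "c" (the VT100/xterm full-reset sequence, RIS), so a printf spelling works too. The od dump below just makes the two emitted bytes visible:

```shell
# printf interprets \033 as the escape byte; "c" follows literally.
SEQ=$(printf '\033c' | od -An -c | tr -d ' \n')
echo "$SEQ"
```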
Posted by: Dakusan
« on: January 25, 2016, 09:48:13 pm »
To get the exported entries from a single dll in cygwin, create a script with the following code. It takes 1 argument as a parameter.
objdump -p "$1" | grep -Pzo '(?is)^\[Ordinal/Name Pointer\] Table.*?\n\n' | grep -oP '(?<=\d\] ).*$'

If you saved the script as "get_dll_exports", to run it against multiple DLLs at a time, create another script as follows:

for i in "$@"; do echo -e "--------\n$i\n--------"; get_dll_exports "$i"; done

Or, to process multiple DLLs but output all of a single file's results on one line with the filename preceding:

for i in "$@"; do echo -n "$i: "; get_dll_exports "$i" | perl -pe 's/\n/ /' -; echo; done
Posted by: Dakusan
« on: December 27, 2010, 06:48:10 pm »
Since I’ve never really been a fan of for loops in bash... here’s an alternative to the “List number of files in each subdirectory” script (in the older post below).

find -maxdepth 1 -mindepth 1 -type d -print0 | xargs -i -0 sh -c "echo -n {} \" \"; find \"{}\" -type f | wc -l"

The sub-shell script (sh) is required because you can’t run sub-pipes naturally within an xargs command. You can also modify the maxdepth and mindepth arguments (keep them the same) to show deeper directory sizes, but any files above that depth will of course be ignored.
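The same idea can be spot-checked in a scratch tree. Directory and file names below are invented; GNU find/xargs are assumed, with -I{} and printf used for slightly cleaner quoting than the echo version:

```shell
D=$(mktemp -d)
mkdir -p "$D/one" "$D/two deep"
touch "$D/one/f1" "$D/one/f2" "$D/two deep/g"
# One "name count" line per immediate subdirectory, space-safe via -print0/-0.
COUNTS=$(cd "$D" && find . -mindepth 1 -maxdepth 1 -type d -print0 | \
  xargs -0 -I{} sh -c 'printf "%s %s\n" "{}" "$(find "{}" -type f | wc -l)"')
echo "$COUNTS"
rm -rf "$D"
```

Note the directory with a space in its name survives intact, which the plain for-loop version would mangle.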
Posted by: Dakusan
« on: September 28, 2009, 05:31:01 am »
First, to find out more about any bash command, use:

man COMMAND

Now, a primer on the three most useful bash commands (IMO):

find: Find will search through a directory and its subdirectories for objects (files, directories, links, etc) satisfying its parameters. Parameters are written like a math query, with parentheses for order of operations (make sure to escape them with a “\”!), -a for boolean “and”, -o for boolean “or”, and ! for “not”. If neither -a nor -o is specified, -a is assumed. For example, to find all files that contain “conf” but do not have “.bak” as the extension, OR are greater than 5MB:

find -type f \( \( -name "*conf*" ! -name "*.bak" \) -o -size +5120k \)

Some useful parameters include:
- -maxdepth & -mindepth: only look through certain levels of subdirectories
- -name: name of the object (-iname for case insensitive)
- -regex: name of object matches regular expression
- -size: size of object
- -type: type of object (block special, character special, directory, named pipe, regular file, symbolic link, socket, etc)
- -user & -group: object is owned by user/group
- -exec: exec a command on found objects
- -print0: output each object separated by a null terminator (great so other programs don’t get confused from white space characters)
- -printf: output specified information on each found object (see man file)
For any numeric argument n, use:
- +n for greater than n
- -n for less than n
- n for exactly n
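A small demonstration of the +n form with -size, in a scratch directory (file names and sizes are arbitrary):

```shell
D=$(mktemp -d)
dd if=/dev/zero of="$D/big" bs=1024 count=10 2>/dev/null   # 10 KiB
printf 'tiny' > "$D/small"                                 # 4 bytes
BIG=$(find "$D" -type f -size +5k)   # +5k: strictly larger than 5 KiB
echo "$BIG"
rm -rf "$D"
```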
For a complete reference, see your find’s man page.

xargs: xargs passes piped arguments to another command as trailing arguments. For example, to list information on all files in a directory greater than 1MB (note this will not work with paths that have spaces in them; use “find -print0” and “xargs -0” to fix that):

find -size +1024k | xargs ls -l

Some useful parameters include:
- -0: piped arguments are separated by null terminators
- -n: max arguments passed to each command
- -i: replaces “{}” with the piped argument(s)
So, for example, if you had 2 mirrored directories and wanted to sync their modification timestamps:

cd /ORIGINAL_DIRECTORY
find -print0 | xargs -0 -i touch -m -r "{}" "/MIRROR_DIRECTORY/{}"

(Note that touch’s -r option takes the reference file as a separate argument, not as “-r=FILE”.) For a complete reference, see your xargs’s man page.

grep: grep is used to search through data for plain text, regular expression, or other pattern matches. You can use it to search through both pipes and files. For example, to get your number of CPUs and their speeds:

cat /proc/cpuinfo | grep MHz

Some useful parameters include:
- -E: use extended regular expressions
- -P: use perl regular expression
- -l: output files with at least one match (-L for no matches)
- -o: show only the matching part of the line
- -r: recursively search through directories
- -v: invert to only output non-matching lines
- -Z: separates matches with null terminator
So, for example, to list all files under your current directory that contain “foo1”, “foo2”, or “bar”, you would use:

grep -rlE "foo(1|2)|bar"

For a complete reference, see your grep’s man page.

And now some useful commands and scripts:

List size of subdirectories:

du --max-depth=1

The --max-depth parameter specifies how many sub levels to list. -h can be added for more human readable sizes.

List number of files in each subdirectory*:

#!/bin/bash
export IFS=$'\n' #Forces only newlines to be considered argument separators
for dir in `find -maxdepth 1 -type d`
do
	a=`find $dir -type f | wc -l`
	if [ $a != "0" ]
	then
		echo $dir $a
	fi
done
and to sort those results:

SCRIPTNAME | sort -n -k2

List number of different file extensions in current directory and subdirectories:

find -type f | grep -Eo "\.[^\.]+$" | sort | uniq -c | sort -nr

Replace text in file(s):

perl -i -pe 's/search1/replace1/g; s/search2/replace2/g' FILENAMES

If you want to make pre-edit backups, include an extension after “-i”, like “-i.orig”.

Perform operations in directories with too many files to pass as arguments (in this example, remove all files from a directory 100 at a time instead of using “rm -f *”):

find -type f | xargs -n100 rm -f

Force kill all processes containing a string:

killall -9 STRING

Transfer MySQL databases between servers (works in Windows too):

mysqldump -u LOCAL_USER_NAME -p LOCAL_DATABASE | mysql -u REMOTE_USER_NAME -p -D REMOTE_DATABASE -h REMOTE_SERVER_ADDRESS

“-p” specifies a password is needed.

Some lesser known commands that are useful:
- screen: This opens up a virtual console session that can be disconnected and reconnected from without stopping the session. This is great when connecting to a console through SSH so you don’t lose your progress if disconnected.
- htop: An updated version of top, which is a process information viewer.
- iotop: A process I/O (input/output - hard drive access) information viewer. Requires Python 2.5 or higher and I/O accounting support compiled into the Linux kernel.
- dig: Domain information retrieval. See the “Diagnosing DNS Problems” post for more information.

More to come later...

* Anything starting with “#!/bin/bash” is intended to be put into a script.