The size of a Bash variable is not fixed. It can hold arbitrary amounts of data, as long as malloc can find sufficient memory and contiguous address space. Let's assume you have stored a very large amount of data in your variable. When you try to write that data to a file, you may get an error like this:
/bin/echo ${LARGE_DATA} >> ${YourFile}
/bin/echo: Argument list too long
This error is related to the maximum length of your command's argument list.
Please check the "Limits on size of arguments and environment" section of the execve man page: http://man7.org/linux/man-pages/man2/execve.2.html
"... the memory used to store the environment and argument strings was limited to 32 pages (defined by the kernel constant MAX_ARG_PAGES). On architectures with a 4-kB page size, this yields a maximum size of 128 kB ..."
EDIT:
Please also note that the above error for /bin/echo is just an example; you can get a similar error with other commands when writing to a file. The issue is the argument size.
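You can check the limit that applies on your own system; a small sketch (the variable names `LARGE_DATA` and `YourFile` are just the illustrative names used above):

```shell
#!/bin/sh
# ARG_MAX is the kernel's limit on the combined size of the argument
# list and environment passed to execve(2).
limit=$(getconf ARG_MAX)
echo "ARG_MAX on this system: $limit bytes"

# The limit applies only to external commands such as /bin/echo, which
# are started via execve(2). A shell builtin never calls execve, so a
# builtin like printf can usually write data of any size:
#   printf '%s\n' "$LARGE_DATA" >> "$YourFile"
```

On modern Linux kernels the effective limit is typically derived from the stack rlimit rather than the historical fixed 128 kB, so `getconf ARG_MAX` is the reliable way to see the actual value.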
SUGGESTION:
If we look at each write-to-file operation in isolation: every time a pipeline is created for writing, file descriptors are opened and closed, and that takes some time. Instead of using /bin/echo or similar external commands, you can write your own "WriteFile" program in a higher-level language like C/C++. What you need is I/O redirection:
- Open file descriptor
- Write data
- Close file descriptor
- Optimize your code
Done
Please check system calls like:
ssize_t write(int fd, const void *buf, size_t count);
http://linux.die.net/man/2/write
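If you would rather not write a separate C helper, the same open-once/write-many/close-once pattern is available in the shell itself via `exec` redirection; a minimal sketch (the file name `out.log` and the loop body are illustrative):

```shell
#!/bin/sh
# Start from an empty log so the example is repeatable.
rm -f out.log

# Open the log once on file descriptor 3 instead of reopening it
# for every write.
exec 3>> out.log

for i in 1 2 3; do
    # Each printf writes to the already-open descriptor; no extra
    # open(2)/close(2) calls happen per iteration.
    printf 'line %d\n' "$i" >&3
done

# Close the descriptor when finished.
exec 3>&-

cat out.log
```

This keeps a single file descriptor open for the whole loop, which is the same optimization a dedicated WriteFile program built around write(2) would give you.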
COMMENTS:
`sed` with a shell loop is the wrong way to go. It looks like you're calling `sed` singly already for each iteration of a loop - that's where you get the loop counter value? For big jobs you should be using some stream-capable tool - like `sed` - to tell the shell what to do, not vice versa: `<infile cmd | cmd | cmd | cmd >outfile`. 10 to 1 the shell is the weakest link in your performance chain. – mikeserv Dec 10 '14 at 21:15
`nl` can do the `call stored...` part with its `-s` separator string. It's not very clear what you're doing though - you provide no sample input or output. What does `db2` do... why? – mikeserv Dec 10 '14 at 23:05
`db2` appears to be some sort of IBM database command-line utility. It is highly unlikely that `nl` does anything close to what it does. – jw013 Dec 11 '14 at 00:53
Move the `>>log_file.txt` outside the loop, so you have `done <infile >>log_file.txt` as the last line instead. That way you keep the file open and avoid reopening and reclosing it each iteration. If you want to try anything more complicated though, you should first take measurements to see if disk I/O is really taking enough time to be worth optimizing. You can probably do this by replacing `>>log_file.txt` with `>/dev/null` to get rid of disk I/O altogether and see how much of a speed-up you get. – jw013 Dec 11 '14 at 01:22
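The loop-redirection tip above can be sketched as follows (the file names `infile` and `log_file.txt` are the hypothetical ones from the comment, and the loop body is illustrative):

```shell
#!/bin/sh
# Create a small sample input file and start with a fresh log.
printf 'a\nb\nc\n' > infile
rm -f log_file.txt

# Slow pattern: log_file.txt is opened and closed on every iteration:
#   while read -r line; do echo "$line" >> log_file.txt; done < infile
#
# Faster pattern: redirect the loop itself, so log_file.txt is opened
# once and stays open for all iterations:
while IFS= read -r line; do
    printf 'processed: %s\n' "$line"
done < infile >> log_file.txt

cat log_file.txt
```

Redirecting the `done` keyword attaches the descriptors to the whole compound command, so the shell performs one open(2) and one close(2) total instead of one pair per iteration.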