Directed input and output
Posted: Mon Aug 18, 2003 9:05 pm
by oldefoxx
It strikes me as odd that you have to use a file handle when you use the
ExamineDirectory() function, but none of the associated functions, such
as NextDirectoryEntry(), make use of that handle. What is even more odd
is that you have to use a file handle for OpenFile() or ReadFile(), but you
do not use file handles for anything like ReadString() or WriteString(), which always default to the current file.

How would you go about trying to take two input files and merge them together into one output file when you are this limited? I think the solution is to keep the current behaviour, but also to allow a file handle in those associated commands so that you can handle multiple files at the same time. As it stands, this is a big limitation.
Re: Directed input and output
Posted: Tue Aug 19, 2003 2:56 am
by PB
> It strikes me odd that you have to use a file handle when you use the
> ExamineDirectory() function, but none of the associated functions, such
> as NextDirectoryEntry() make use of that handle.
ExamineDirectory uses an ID, which can be used later in your code with
the UseDirectory command to change directories in your apps. One such
example is when doing a recursive directory loop, in which case the ID is
needed to dig deeper into the directory tree.
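For example, a rough, untested sketch of such a recursive loop (the directory number, the path and the "*.*" pattern are only placeholders; it also assumes the convention that NextDirectoryEntry() returns 1 for a file, 2 for a subdirectory and 0 at the end, and that DirectoryEntryName() gives the current entry's name, so check your version's docs):

Procedure ScanDirectory(id, path$)
  If ExamineDirectory(id, path$, "*.*")
    Repeat
      type = NextDirectoryEntry()        ; assumed: 1 = file, 2 = directory, 0 = done
      If type = 1
        Debug path$ + DirectoryEntryName()
      ElseIf type = 2
        name$ = DirectoryEntryName()
        If name$ <> "." And name$ <> ".."
          ScanDirectory(id + 1, path$ + name$ + "\")   ; each level gets its own ID
          UseDirectory(id)               ; come back to this level's listing
        EndIf
      EndIf
    Until type = 0
  EndIf
EndProcedure

ScanDirectory(0, "C:\SomeFolder\")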
> How would you go about trying to take two input files and merge them
> together into one output file when you are this limited?
You simply use the UseFile command to switch between the input files.
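For instance, a rough, untested sketch of merging two inputs into one output that way (file names are placeholders; it assumes the 3.x-style ReadString()/WriteStringN() acting on the current file, with Eof() and CloseFile() taking the file number):

ReadFile(0, "first.txt")           ; error checking omitted for brevity
ReadFile(1, "second.txt")
CreateFile(2, "merged.txt")

While Eof(0) = 0
  UseFile(0) : line$ = ReadString()
  UseFile(2) : WriteStringN(line$)
Wend
While Eof(1) = 0
  UseFile(1) : line$ = ReadString()
  UseFile(2) : WriteStringN(line$)
Wend

CloseFile(0) : CloseFile(1) : CloseFile(2)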
Posted: Tue Aug 19, 2003 12:29 pm
by Searhin
@ PB (UseFile)
true, but it would be much simpler (and more user-friendly) if you could specify the active file in the Read/Write commands.
As it is now, it can get very confusing if you have to read from different files. Say you have a file containing a large database and several files with pointers to data stored in the large file. Each time the user asks for some data, you have to read pointers from the index files, seek the corresponding positions in the large file, and then read the data from the database. Because the Read/Write commands take no file parameter, you can't be sure which file is the active one unless you put a UseFile() before every other file operation (see the sketch below).
So our lives would be easier, I think, without having to change the active file with a UseFile(x) command every time.
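To illustrate, an untested sketch of that index/database situation (the file names, the 4-byte pointer size, and FileSeek()/ReadLong() operating on the current file are all assumptions here):

recordNumber = 5                   ; whichever record the user asked for
ReadFile(0, "data.db")             ; the large database file
ReadFile(1, "index.idx")           ; pointers into the database

UseFile(1)
FileSeek(recordNumber * 4)         ; assumes 4-byte pointers in the index
offset = ReadLong()
UseFile(0)                         ; forget this line and you read from the index instead
FileSeek(offset)
record$ = ReadString()

CloseFile(0) : CloseFile(1)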

Posted: Tue Aug 19, 2003 1:13 pm
by dmoc
I have to agree. I am in the process of extending a program and have already spent time tracking down a bug which turned out to be a missing UseFile(). That was when I was only handling two open files, but now I need to extend it to four (or possibly more). The problem with the current method is that while it provides a form of shorthand for simple cases, it makes it difficult to "modularise" more complicated programs. I would prefer to specify explicitly which file is being used in every file-related command and avoid the potential for bugs. In the meantime this can be done by wrapping *all* the file commands in custom procedures, as sketched below.
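For example, wrappers along these lines (untested; the names are made up, and the same assumption applies that ReadString()/WriteStringN() act on the current file):

Procedure.s ReadStringFrom(file)
  UseFile(file)
  ProcedureReturn ReadString()
EndProcedure

Procedure WriteStringTo(file, text$)
  UseFile(file)
  WriteStringN(text$)
EndProcedure

; The file number now travels with every call, so a missing UseFile()
; can no longer make you read from or write to the wrong file:
; line$ = ReadStringFrom(1)
; WriteStringTo(2, line$)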
NextDirectoryEntry()
Posted: Wed Aug 20, 2003 4:28 am
by oldefoxx
It strikes me that if you have to use a UseDirectory() or a UseFile() before using an associated command, especially when working from within a procedure, the apparent advantage of relying on the current directory or current file is somewhat defeated. And when you have to use two statements to do the job of one, that is a net loss. I just think that using ReadString(1) is more advantageous than using UseFile(1):ReadString() when the situation warrants it. Another problem with relying on the current directory or current file is that you cannot tell which one is current. If I do a UseFile(1), how do I know what the current file was before that, so that I can put it back as the current file when I am through with my own immediate need?
There is a point I would like to make about reading directories. I have found that if you are in the process of reading the entries in a directory and you then either create a new file or delete one in that directory, the only safe thing to do is to start all over again with another ExamineDirectory() call. Under DOS or Windows, the proper sequence of entries is often broken by the process of adding or deleting an entry. The best approach is to read all the entries in the directory and save them, then do whatever you have to do. Consequently, the idea of having more than one directory open for examination strikes me as a bit odd, for the simple reason that I would normally read both into different arrays, sort them, then work from the array contents rather than from the incompletely read directories, along the lines of the sketch below.
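An untested sketch of that approach (the directory name, the array size and the NextDirectoryEntry() return convention are assumptions, as above):

Dim entry$(1000)                   ; arbitrary upper bound for the example
count = 0
If ExamineDirectory(0, "C:\SomeFolder\", "*.*")
  Repeat
    type = NextDirectoryEntry()
    If type = 1                    ; keep plain files, skip subdirectories
      entry$(count) = DirectoryEntryName()
      count = count + 1
    EndIf
  Until type = 0
EndIf
; Sort entry$() here if you need the names in order, then work only from
; the array and count; creating or deleting files in the directory no
; longer disturbs the listing you are walking.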
So for directories, I don't even envision a need for a numbered reference. You open a directory, you read its contents, you go to the next
directory. That's it. To encourage other behaviour could be irresponsible, considering how fragile the thread behind the NextDirectoryEntry() sequence is. In fact I would not defer reading a directory, as there are automated processes that occasionally create temporary or log files, which, if they happen to land in the same directory as the one you are currently examining, could have the same potentially disruptive or disastrous effect.
Fact is, though I have never bothered to do it, I believe you might have to read the same directory at least twice, comparing the entry results each time, to ensure that the directory's contents did not change in the interim. If they did, just read it again until you get the same results twice in a row. Like I said, I never bothered doing this because I never ran into an actual case where another application was accessing the same directory that I was, creating or deleting a directory entry at that exact moment, so this is likely overkill. But if your own application creates or deletes a directory entry while you are in the process of reading it, the odds are that you will suddenly get a return value of zero from NextDirectoryEntry(), because the thread being followed was disrupted.