Memory is duplicated because you want to use different programs, that is, different processes, and processes are isolated from one another.
One possible solution is to have only one process handle the file object itself while other processes use the file's data. For example, you could create some sort of service application that opens the file for exclusive access. It could use some kind of inter-process messaging and enter a message loop to serve other applications, providing the required pieces of information from the file in response to requests from other processes. Such a design is generally quite easy to implement, but the complexity always depends on the semantic complexity of the data and of the possible requests.
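Here is a minimal sketch of that idea, assuming Python and its standard `multiprocessing.connection` module; the file name, the endpoint address, and the `(offset, size)` request format are made up purely for illustration:

```python
# Sketch of a "file service": the only process that opens the file,
# serving pieces of it to other processes over a simple message loop.
from multiprocessing.connection import Listener

ADDRESS = ("localhost", 6000)      # hypothetical endpoint
AUTHKEY = b"file-service"          # shared secret for the clients

def serve(path: str) -> None:
    # This service is the only process that ever opens the file.
    with open(path, "rb") as f, Listener(ADDRESS, authkey=AUTHKEY) as listener:
        while True:                              # the message loop
            with listener.accept() as conn:
                try:
                    while True:
                        offset, size = conn.recv()   # request from a client
                        f.seek(offset)
                        conn.send(f.read(size))      # reply with the requested piece
                except EOFError:
                    pass                             # client disconnected

if __name__ == "__main__":
    serve("data.bin")   # hypothetical file name
```

A client process would then connect with `Client(ADDRESS, authkey=AUTHKEY)` from the same module and send `(offset, size)` tuples, never touching the file itself.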
Another approach is to migrate from multiple processes to multiple threads within a single process; you have already received this suggestion, please see the comments to your question. With multi-threading, you can still use the same kind of service implemented in one thread, but you can also access the shared data directly, say, read-only, as in the sketch below.
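A minimal sketch of that single-process variant, again assuming Python; the file name and the per-thread work are made up:

```python
# Several threads sharing one in-memory copy of the file's data.
import threading

def load(path: str) -> bytes:
    with open(path, "rb") as f:
        return f.read()            # loaded exactly once

def worker(name: str, data: bytes) -> None:
    # Read-only access to the shared data needs no locking.
    print(name, "sees", len(data), "bytes")

if __name__ == "__main__":
    shared = load("data.bin")      # hypothetical file name
    threads = [threading.Thread(target=worker, args=(f"worker-{i}", shared))
               for i in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```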
The particular form of such a service and the messaging mechanism depend on the platform and other factors, but they exist virtually everywhere.
I want to reiterate that my suggestion is not the ultimate answer, just an example; what to choose is up to you.