79742087

Date: 2025-08-21 09:25:38
Score: 1
Natty:
Report link
import torch.multiprocessing as mp
import torch

def foo(worker, tl):
    # In-place update of this worker's tensor
    tl[worker] += (worker + 1) * 1000

if __name__ == '__main__':
    mp.set_start_method('spawn')
    tl = [torch.randn(2,), torch.randn(3,)]

    # for t in tl:
    #     t.share_memory_()

    print("before mp: tl=")
    print(tl)

    p0 = mp.Process(target=foo, args=(0, tl))
    p1 = mp.Process(target=foo, args=(1, tl))
    p0.start()
    p1.start()
    p0.join()
    p1.join()

    print("after mp: tl=")
    print(tl)

# Output of one run:
# before mp: tl=
# [tensor([1.7138, 0.0069]), tensor([-0.6838,  2.7146,  0.2787])]
# after mp: tl=
# [tensor([1001.7137, 1000.0069]), tensor([1999.3162, 2002.7146, 2000.2787])]

I have another question: as long as mp.set_start_method('spawn') is used, tl is still modified in the parent process even if I comment out t.share_memory_(). Why?
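A minimal sketch of a likely explanation, assuming the cause is torch.multiprocessing's custom pickler: when a CPU tensor is passed as a `Process` argument, the pickler moves the tensor's storage into shared memory in place while serializing it for the child, so `share_memory_()` happens implicitly. The `worker`/`demo` names below are mine, used only to illustrate the check via `Tensor.is_shared()`:

```python
import torch
import torch.multiprocessing as mp

def worker(t):
    # In-place write in the child; visible to the parent if storage is shared
    t += 100

def demo():
    t = torch.zeros(3)
    shared_before = t.is_shared()   # ordinary private storage at this point
    p = mp.Process(target=worker, args=(t,))
    p.start()                       # pickling the args moves t into shared memory
    p.join()
    return shared_before, t.is_shared(), t

if __name__ == '__main__':
    mp.set_start_method('spawn', force=True)
    before, after, t = demo()
    print(before, after)   # expect: False True
    print(t)               # expect: tensor([100., 100., 100.])
```

If this is right, the explicit `share_memory_()` calls only matter when tensors reach the child by some route other than `Process` args (e.g. inherited globals under the fork start method).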

Reasons:
  • Blacklisted phrase (1): another question
  • Long answer (-0.5):
  • Has code block (-0.5):
  • Low reputation (1):
Posted by: damon tang