
ParallelTable with Sequencer does not seem to work #431

@vguptai

Description


Hi,

I am trying to use ParallelTable with Sequencer but am not able to get the desired result. I understand that I am not passing the input to forward in the correct way. How should I pass this data?

One way would be to stack the input data side by side before passing it to the model and then split it inside the model, but I am not able to find a module that can do that. Basically, I would need to pass a tensor of size (50, 16, 1027) to the Sequencer and, inside the Sequencer, have a module that splits it back again.
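For reference, this stacking-then-splitting idea can be sketched with stock nn modules: torch.cat joins the two tensors along the feature dimension, and inside the Sequencer an nn.ConcatTable holding two nn.Narrow modules splits each step's (16, 1027) slice back into the two pieces. This is only a sketch, assuming the Sequencer iterates over the first dimension of a 3-D tensor and feeds a 2-D (batch x feature) slice to the inner module at each step:

```lua
require 'nn'
require 'rnn'

local a = torch.randn(50, 16, 1024)
local b = torch.randn(50, 16, 3)

-- join along the feature dimension: (50, 16, 1027)
local joined = torch.cat(a, b, 3)

-- each Sequencer step sees a (16, 1027) slice; ConcatTable + Narrow
-- splits it back into the table {(16, 1024), (16, 3)}
local splitter = nn.ConcatTable()
splitter:add(nn.Narrow(2, 1, 1024))    -- columns 1..1024
splitter:add(nn.Narrow(2, 1025, 3))    -- columns 1025..1027

local parallel_table = nn.ParallelTable()
parallel_table:add(nn.Linear(1024, 1024))
parallel_table:add(nn.Linear(3, 20))

local m1 = nn.Sequential()
m1:add(splitter)
m1:add(parallel_table)
m1 = nn.Sequencer(m1)

local out = m1:forward(joined)
```

Since each step now outputs a table rather than a tensor, the Sequencer's output should be inspected per step rather than assumed to be a single stacked tensor.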

Here is the code:

local a = torch.randn(50,16,1024)
local b = torch.randn(50,16,3)
local m1 = nn.Sequential()
local parallel_table = nn.ParallelTable()
parallel_table:add(nn.Linear(1024, 1024))
parallel_table:add(nn.Linear(3, 20))
m1:add(parallel_table)
m1 = nn.Sequencer(m1)
m1:forward({a,b})

Following is the error that I get:

In 1 module of nn.Sequential:
In 2 module of nn.ParallelTable:
/usr/local/torch/install/share/lua/5.1/nn/Linear.lua:66: size mismatch, m1: [16 x 1024], m2: [3 x 20] at /usr/local/torch/pkg/torch/lib/TH/generic/THTensorMath.c:1293
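The error is consistent with Sequencer treating the table {a, b} itself as the sequence: at step 1 it hands the whole tensor a to the inner module, so ParallelTable indexes a[1] (16 x 1024) and a[2] (also 16 x 1024), and the second Linear(3, 20) then fails with the [16 x 1024] vs [3 x 20] mismatch above. A possible fix, assuming the rnn package's convention that a table input to Sequencer holds one entry per timestep, is to build a table of {a[t], b[t]} pairs:

```lua
require 'nn'
require 'rnn'

local a = torch.randn(50, 16, 1024)
local b = torch.randn(50, 16, 3)

local parallel_table = nn.ParallelTable()
parallel_table:add(nn.Linear(1024, 1024))
parallel_table:add(nn.Linear(3, 20))
local m1 = nn.Sequencer(parallel_table)

-- one table entry per timestep, each holding the pair of inputs
-- for that step: {(16, 1024), (16, 3)}
local input = {}
for t = 1, 50 do
  input[t] = {a[t], b[t]}
end

local out = m1:forward(input)
-- out[t] should then be the table of per-step outputs
```

This keeps the two inputs as separate tensors and avoids the concatenate/split round trip entirely.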
