Yes, there's one additional frame of input lag compared to a single chip. Depending on how you use your dual setup, that works out to anywhere from 0 to 1 extra frame of lag in practice. If you use the dual setup to render at a higher framerate, say 100 fps vs. 50 fps for a single chip, the lag will be identical, assuming of course perfect scaling so you actually get double the performance. If you instead raise the resolution so your framerate stays the same, you get the full extra frame of lag.
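A quick back-of-envelope sketch of that point (a simplified model, not how any driver actually measures it; the function name is made up for illustration):

```python
# Simplified AFR (alternate frame rendering) latency model: each GPU spends a
# full frame interval on its frame, so with N GPUs a frame sits in the pipeline
# for N frame intervals before it's done.

def render_latency_ms(fps, gpus):
    """Rough render latency: `gpus` frame intervals at the given framerate."""
    frame_time_ms = 1000.0 / fps
    return gpus * frame_time_ms

# Perfect scaling case: dual GPU at 100 fps vs. single GPU at 50 fps.
dual = render_latency_ms(100, 2)       # 2 * 10 ms = 20 ms
single = render_latency_ms(50, 1)      # 1 * 20 ms = 20 ms -> identical lag

# Resolution-increase case: dual GPU still at 50 fps.
dual_same_fps = render_latency_ms(50, 2)  # 2 * 20 ms = 40 ms -> one extra frame
```

With perfect scaling the doubled framerate exactly cancels the extra pipeline stage; at the same framerate the extra stage shows up as a full frame of added lag.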
Now to be fair, it's not one frame of lag versus no lag; there is always lag. Direct3D allows the CPU to run up to three frames ahead of the GPU. Even if the game reduces that queue to a single frame, there's still the wait (half a frame on average) from your input until the application reads its message queue, then the time to render the frame, and then the time to transfer it to the monitor before you see the change. Say you're rendering at 60 fps on a 60 Hz monitor: that's about 2.5 frames on average, or roughly 40 ms, until the change is fully visible on your screen. So instead of thinking of it as 1 vs. 0, think of it as more like 3.5 vs. 2.5, or depending on how you count, 3 vs. 2.
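The same arithmetic, spelled out as a sketch (the stage breakdown follows the paragraph above; the helper names are my own):

```python
# Sum the lag stages described above, in units of frames.

def total_lag_frames(extra_queue_frames):
    input_wait = 0.5   # average wait until the app reads its message queue
    render = 1.0       # rendering the frame
    scanout = 1.0      # transferring the frame to the monitor
    return input_wait + render + scanout + extra_queue_frames

def frames_to_ms(frames, fps):
    return frames * 1000.0 / fps

single_gpu = total_lag_frames(0)   # 2.5 frames
dual_gpu = total_lag_frames(1)     # 3.5 frames with the extra AFR frame

single_gpu_ms = frames_to_ms(single_gpu, 60)   # about 41.7 ms at 60 fps / 60 Hz
```

So at 60 fps on a 60 Hz display the baseline is already around 40 ms, which is why the dual-GPU difference is better framed as 3.5 vs. 2.5 frames than as 1 vs. 0.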