### What this PR does / why we need it?
Fixes a bug caused by incorrect `reshape` usage: `reshape` was used where `permute` was intended, and the two reorder elements differently.
For example:

```
ori_tensor:     [[1, 2, 3], [4, 5, 6]]
after reshape:  [[1, 2], [3, 4], [5, 6]]
after permute:  [[1, 4], [2, 5], [3, 6]]
```
We now use `squeeze` directly instead, which is more intuitive.
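A minimal NumPy sketch of the semantics involved (the PR itself operates on torch tensors, but `reshape`, `permute`/`transpose`, and `squeeze` behave the same way; the variable names here are illustrative, not from the PR):

```python
import numpy as np

ori = np.array([[1, 2, 3], [4, 5, 6]])

# reshape keeps row-major element order and only changes the shape.
reshaped = ori.reshape(3, 2)      # [[1, 2], [3, 4], [5, 6]]

# transpose (torch's permute) reorders axes, giving a different layout.
permuted = ori.transpose(1, 0)    # [[1, 4], [2, 5], [3, 6]]

# For a size-1 dimension, squeeze removes it directly, with no risk of
# silently reordering elements.
x = np.array([[[1, 2, 3]]])       # shape (1, 1, 3)
squeezed = x.squeeze()            # shape (3,)
```

The key point: `reshape` and `permute` only coincide in special cases, so replacing one with the other changes the data layout.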
PR for main: #7887
### Does this PR introduce _any_ user-facing change?
No API change; the actual peak-to-average ratio has decreased.
Signed-off-by: shenchuxiaofugui <1311027364@qq.com>