author | Eric Biggers <ebiggers@google.com> | 2024-04-05 20:26:09 -0400
committer | Herbert Xu <herbert@gondor.apana.org.au> | 2024-04-12 15:07:52 +0800
commit | 57ce8a4e162599cf9adafef1f29763160a8e5564 (patch)
tree | be792c96eea64499bc28643ae25998a23b9722c2 /arch/x86/crypto
parent | 4ad096cca942959871d8ff73826d30f81f856f6e (diff)
crypto: x86/sha256-avx2 - add missing vzeroupper
Since sha256_transform_rorx() uses ymm registers, execute vzeroupper
before returning from it. Leaving the upper halves of the ymm registers
dirty would otherwise incur an AVX-SSE transition penalty that reduces
the performance of subsequent SSE code.
Fixes: d34a460092d8 ("crypto: sha256 - Optimized sha256 x86_64 routine using AVX2's RORX instructions")
Signed-off-by: Eric Biggers <ebiggers@google.com>
Acked-by: Tim Chen <tim.c.chen@linux.intel.com>
Signed-off-by: Herbert Xu <herbert@gondor.apana.org.au>
Diffstat (limited to 'arch/x86/crypto')
-rw-r--r-- | arch/x86/crypto/sha256-avx2-asm.S | 1
1 file changed, 1 insertion, 0 deletions
diff --git a/arch/x86/crypto/sha256-avx2-asm.S b/arch/x86/crypto/sha256-avx2-asm.S
index 9918212faf91..0ffb072be956 100644
--- a/arch/x86/crypto/sha256-avx2-asm.S
+++ b/arch/x86/crypto/sha256-avx2-asm.S
@@ -716,6 +716,7 @@ SYM_TYPED_FUNC_START(sha256_transform_rorx)
 	popq	%r13
 	popq	%r12
 	popq	%rbx
+	vzeroupper
 	RET
 SYM_FUNC_END(sha256_transform_rorx)
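
For context, the following is a minimal standalone sketch of the general pattern this patch applies: a routine that writes ymm registers executes vzeroupper before returning, so that later legacy SSE code does not pay the AVX-SSE transition penalty. It is not taken from the kernel tree; the function name avx2_copy32_example and its signature are hypothetical. It assembles with the GNU assembler and assumes the System V AMD64 calling convention.

	.text
	.globl	avx2_copy32_example
	.type	avx2_copy32_example, @function
# void avx2_copy32_example(const void *src /* %rdi */, void *dst /* %rsi */);
# Copies 32 bytes through a ymm register, then clears the upper halves of
# the ymm registers so that callers running legacy SSE code afterwards do
# not incur the AVX->SSE transition penalty.
avx2_copy32_example:
	vmovdqu	(%rdi), %ymm0		# 256-bit load dirties the upper half of ymm0
	vmovdqu	%ymm0, (%rsi)		# 256-bit store
	vzeroupper			# zero bits 255:128 of all ymm registers
	ret
	.size	avx2_copy32_example, .-avx2_copy32_example

In the kernel's assembly the same pattern appears with the SYM_FUNC_START()/RET linkage macros, with vzeroupper placed immediately before RET, as the diff above shows for sha256_transform_rorx().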