Hi,

While running some sign/verify tests, I discovered that by default we explicitly use the SHA-1 algorithm to create the digest of a message. For reference, see the code below, which is called by `xtest --asym-perf -a ECDSA_SIGN -d 224`:
```c
TEE_Result cmd_asym_prepare_hash(uint32_t param_types,
				 TEE_Param params[TEE_NUM_PARAMS])
{
	TEE_Result res = TEE_ERROR_GENERIC;
	TEE_OperationHandle hash_op = NULL;
	uint32_t hash_algo = 0;
	uint32_t exp_param_types = TEE_PARAM_TYPES(TEE_PARAM_TYPE_VALUE_INPUT,
						   TEE_PARAM_TYPE_MEMREF_INPUT,
						   TEE_PARAM_TYPE_MEMREF_INOUT,
						   TEE_PARAM_TYPE_NONE);

	if (param_types != exp_param_types)
		return TEE_ERROR_BAD_PARAMETERS;

	if (params[0].value.a == ALGO_ECDSA)
		hash_algo = TEE_ALG_SHA1;
```
In the code above, the hash algorithm is set to `TEE_ALG_SHA1` whenever the key is ECDSA, irrespective of the key size. Why are we doing this here? Is there a reason for it?
Similarly, test case 4006 does the same:

optee_test/host/xtest/regression_4000.c, line 4197 (commit 5ecc5d8):

```c
if (tv->mode == TEE_MODE_VERIFY || tv->mode == TEE_MODE_SIGN) {
```