SHA1 as default digest algorithm while running asymmetric performance test or 4006 test? why? #808

@kshitizvars

Description

Hi,

While running some sign/verification tests, I discovered that by default we explicitly use the SHA1 algorithm to create the digest of a message. For reference, see the code below, which is called by xtest --asym-perf -a ECDSA_SIGN -d 224:

TEE_Result cmd_asym_prepare_hash(uint32_t param_types,
				 TEE_Param params[TEE_NUM_PARAMS])
{
	TEE_Result res = TEE_ERROR_GENERIC;
	TEE_OperationHandle hash_op = NULL;
	uint32_t hash_algo = 0;
	uint32_t exp_param_types = TEE_PARAM_TYPES(TEE_PARAM_TYPE_VALUE_INPUT,
						   TEE_PARAM_TYPE_MEMREF_INPUT,
						   TEE_PARAM_TYPE_MEMREF_INOUT,
						   TEE_PARAM_TYPE_NONE);


	if (param_types != exp_param_types)
		return TEE_ERROR_BAD_PARAMETERS;


	if (params[0].value.a == ALGO_ECDSA)
		hash_algo = TEE_ALG_SHA1;

In the code above, the hash algorithm is hardcoded to TEE_ALG_SHA1 whenever the key is ECDSA, irrespective of the key size. Why is it done this way? Is there a reason for it?

Similarly, the 4006 test case does the same:

if (tv->mode == TEE_MODE_VERIFY || tv->mode == TEE_MODE_SIGN) {
