A lean BIKE KEM design for ephemeral key agreement

By Nir Drucker, Shay Gueron, Dusan Kostic
2024
The QC-MDPC code-based KEM BIKE is an alternate candidate in the NIST Post-Quantum Cryptography Standardization Project. Per NIST’s report [2], “The BIKE cryptosystem was initially designed for ephemeral key use but has now been claimed to also support static key use”. BIKE uses the BGF decoder of [9], whose Decoding Failure Rate (DFR) is estimated by means of an extrapolation method. While this methodology provides a solid indication of a very small DFR, which is required for an IND-CCA claim, it may still be considered short of a proven upper bound, as stated in [2]: “... and an upper bound on the decoding failure rate has yet to be found”. Nevertheless, the IND-CPA security of BIKE is established without requiring a small DFR from the decoder, and this property suffices for protocols that use ephemeral keys. This is the case for protocols that maintain the modern notion of forward secrecy (and hence avoid static keys); a prominent example is TLS 1.3.
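To make the ephemeral-key usage pattern concrete, the following Python sketch shows a KEM-based key agreement in the style of a TLS 1.3 handshake, where a fresh key pair is generated per session and the secret key is discarded after decapsulation. The `ToyKem` class and the `keygen`/`encaps`/`decaps` names are illustrative placeholders, not the BIKE API; a real deployment would substitute an actual QC-MDPC KEM implementation.

```python
import os


class ToyKem:
    """Placeholder KEM, used only so the sketch runs; it is NOT BIKE and
    provides no security. A real protocol would plug in a QC-MDPC KEM."""

    def keygen(self):
        sk = os.urandom(32)
        return sk, sk  # toy: the "public key" equals the secret key

    def encaps(self, pk):
        shared = os.urandom(32)
        ct = bytes(a ^ b for a, b in zip(shared, pk))  # toy one-time-pad "encapsulation"
        return ct, shared

    def decaps(self, sk, ct):
        return bytes(a ^ b for a, b in zip(ct, sk))


def ephemeral_handshake(kem):
    # Client: a fresh key pair per handshake; the key is never reused, so a
    # rare decoding failure aborts only this one session, and IND-CPA
    # security suffices.
    client_sk, client_pk = kem.keygen()

    # Server: encapsulates against the one-time public key.
    ciphertext, server_secret = kem.encaps(client_pk)

    # Client: decapsulates, then discards the secret key (forward secrecy).
    client_secret = kem.decaps(client_sk, ciphertext)
    del client_sk

    return client_secret == server_secret


if __name__ == "__main__":
    assert ephemeral_handshake(ToyKem())
```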

This paper examines the communication bandwidth and the performance of a BIKE design that targets only the ephemeral key use cases, i.e., settles for IND-CPA security. We call this design “Lean-BIKE”. This study illustrates the incremental cost of the IND-CCA property. We argue that it would be useful to standardize two configurations of BIKE: a) “Lean-BIKE”, which enjoys the reduced cost of an IND-CPA KEM, for the major class of usages that support forward secrecy; b) BIKE, whose IND-CCA security could be established either by a finer proof methodology for the BGF decoder or with another decoder that has a proven DFR upper bound.
