# Method and device for forming the initial value of a pseudorandom number generator

FIELD: computer engineering; cryptographic systems.

SUBSTANCE: the method is based on calculating entropy estimates and writing mixed compressed data into corresponding cells in different areas of a memory block. On the basis of the written data a new initial value is formed. The device for forming the initial value of a pseudorandom number generator contains a means for source data analysis and calculation of current entropy estimates, a data compression means, a data mixing means, a means for data accumulation and forming of entropy estimates, and a means for forming the new initial value.

EFFECT: the method and device provide the capability of forming initial values with dynamic estimation of source rates, classification of sources into fast and slow, reliable and unreliable, and forming of initial values taking into account the speed characteristics and the reliability of these sources.

10 cl, 2 dwg

The invention relates to computer engineering and, more particularly, to devices forming the starting value for a pseudorandom number generator in the absence of a hardware source of random data, and can be used in various cryptographic systems.

Many tasks require the formation of random numbers. Random numbers are used in simulation, in the formation of passwords, and in the generation of cryptographic keys for various systems of cryptographic protection of information.

Known pseudorandom number generators (PRNG) typically use some type of chaotic system to generate a pseudo-random sequence. The signals received from these systems are digitized and represented in binary form. The binary sequence is then hashed to form a shorter binary sequence, which is then used as a seed value for the PRNG. On the basis of the obtained starting value, the generator produces pseudorandom numbers, which are then used in cryptographic systems.

Modern pseudorandom number generators invariably use sources of randomness. In this case the entropy is estimated experimentally, and subsequent changes in the program are based on the value of this estimate.

In particular, a known technical solution is described in U.S. patent No. 5732138 [1], in which the device forming a new starting value uses random data generated from a variety of chaotic systems. Besides having low performance, this device cannot be used in situations where the sources of random data have different rates of entropy generation, because the device does not classify sources by their reliability and speed. Here, reliable sources are understood to be only those sources that cannot be controlled by an attacker and whose data cannot be reproduced outside the system.

Closest to the claimed invention is the device forming the starting values for a pseudo-random number generator described in U.S. patent No. 5778069 [2]. This device also receives data from different types of sources, combines data from different sources, hashes the received data and, based on it, generates a seed value for the PRNG.

However, this device cannot be used in situations where the sources have different rates of entropy generation, because it does not perform dynamic evaluation of source rates, which is necessary for estimating entropy. In addition, the known device [2] does not classify sources according to their reliability.

The problem to which the invention is directed is to create a device for forming the starting values that allows dynamic evaluation of source rates, classification of sources into fast and slow, reliable and unreliable, as well as forming the starting value taking into account the speed characteristics of the sources and their reliability.

The technical result is achieved due to the fact that the architecture of the device is adapted to a new way of forming a seed value for the pseudorandom number generator in the absence of a hardware source of random data. The essence of the new method is that, to form the starting value, the following operations are performed:

accumulate data from various external sources, representing random processes, in the memory block;

analyze the data to determine source type: fast or slow, reliable or unreliable;

for each source according to the data obtained from this source, calculate the current estimates of entropy;

carry out, for each source, the accumulation of current estimates of entropy by summing these estimates and record them in the memory block; each cell of the memory block is used to record current estimates of entropy from only one data source;

perform compression of the data received from each source using the hash function and then write the result of compression into another cell of the memory block; each such memory cell is used for recording the compressed data from only one data source;

carry out the mixing of the compressed data using the hash function and write the result of mixing into another cell of the memory block;

determine the minimum estimation of the entropy for every reliable source on the basis of the obtained current estimates of entropy;

carry out the calculation of the given estimates of entropy by dividing the minimum estimate by two, and store the sequence of the given values in a sequence of cells of another, for example, the second region of the memory block;

perform the accumulation of the given estimates of entropy for each trusted source by summing these estimates, and record the sums of the given estimates in the corresponding cells in the second region of the memory block; each cell of the memory block is used to record the sum of the estimates of entropy from only one data source;

for each trusted source of data, write the compressed data into another cell of the second memory area; each cell of the memory block is used for recording the compressed data from only one data source;

for each trusted source of data write mixed data in the corresponding cell of the second region of the memory block;

for each trusted source of data, determine a set of given estimates of the entropy stored in the sequence of cells of the second region of the memory block;

carry out the calculation of conservative estimates of entropy by dividing by two the minimum of the given estimates of entropy stored in the respective cells of the second region of the memory block, and store the result in the corresponding cell of another, for example, the third region of the memory block;

carry out the accumulation of conservative estimates of entropy for each trusted source by summing these estimates, and record the sums of the received conservative estimates in the sequence of cells of the third region of the memory block; each cell of the memory block is used for recording conservative estimates of entropy from only one data source;

for each trusted source of data, record the compressed data in the appropriate cell of the third region of the memory block; each memory cell is used for recording the compressed data from only one data source;

for each reliable source of data, write the mixed data into the appropriate cell of the third region of the memory block;

verify the following conditions:

(a) the sum of current estimates of entropy must be equal to or exceed the number 128 for at least three sources, and at least one of these sources must be fast and reliable or slow and reliable,

b) the sum of the given estimates of entropy must be equal to or exceed the number 128 for each of the sources, and all of these sources must be fast and reliable or slow and reliable,

c) the sums of conservative estimates of entropy must be equal to or exceed the number 128 for each of the sources, and all of these sources must be fast and reliable or slow and reliable;

if all these conditions are met, form a new starting value for the pseudorandom sequence generator by additional mixing of the already mixed data stored in the respective cells of the first and second areas of the memory block together with the previous seed value of the pseudo-random sequence generator; the formation is carried out by applying iterative cyclic hashing to the combined data extracted from the cells of the first and second areas of the memory block and the previous starting value.

In a particular embodiment, the external data sources are random processes occurring in a local network of Ethernet type or in a wireless local area network (WLAN); these random processes are the values of signal levels in the WLAN, the intervals between packets arriving in the network, the intervals between packets leaving the network, the intervals between interrupts, and the interrupts themselves.

In another particular embodiment, the mixed data stored in the corresponding cell of the third region of the memory block are used in the formation of a new starting value at the moment of switching on an external device that is part of at least one of the networks.

In another particular embodiment, the areas of the memory block are segments of contiguous address space of a random-access memory array.

In a particular embodiment, to determine the entropy, the frequencies of occurrence of random events in the data source are registered over a fixed time interval, and on the basis of the obtained frequencies the current estimates of entropy are calculated by the following formula:

H = -∑ p_{i} log_{2} p_{i},

where p_{i} is the probability calculated on the basis of the frequency of occurrence of random events during the fixed time interval.

In a particular embodiment, each mentioned source belongs to one of the following types: fast and reliable (FR), slow and reliable (SR), fast and unreliable (FU), slow and unreliable (SU).

Based on the above new method, the device forming the starting value for a pseudorandom number generator in the absence of a hardware source of random data comprises a means for source data analysis and calculation of current estimates of entropy, a data compression means, a data mixing means, a means for summing estimates of entropy, a means for data accumulation and forming of estimates of entropy, a means for forming a new seed value, and a means for distributing starting values to the pseudo-random sequence generator, wherein:

the means for source analysis and calculation of current estimates of entropy is designed to retrieve data from different sources, each representing a different random process, to identify the type of each source based on the received data, and to calculate the current estimates of entropy for each source; the sources are divided into types, the total number of which is defined by the combinations of the following characteristics: fast/slow, reliable/unreliable;

the data compression means is designed to compress data received from the sources and to output the compressed data to the data mixing means, which is designed for mixing the compressed data of each source;

the means for summing estimates of entropy is intended to sum the obtained current estimates of entropy for each source;

the means for data accumulation and forming of estimates of entropy contains two means for calculating estimates of entropy, two summation means, and a memory block divided into three areas, each of which consists of many cells;

the first means for calculating estimates of entropy is designed to calculate the given estimates of entropy for each data source based on the current estimates of the entropy of this source obtained from the means for analyzing the sources and calculating the current estimates of entropy;

the first summation means is designed to calculate the sums of the given estimates of entropy for each source, obtained by summing the given estimates of the entropy of this source;

the second means for calculating estimates of entropy is designed to calculate conservative estimates of entropy for each trusted source based on a selected sequence of the given estimates of the entropy obtained in the first means for calculating estimates of entropy;

the second summation means is designed to calculate the sums of conservative estimates of entropy for each trusted source by summing the conservative estimates of the entropy of this source calculated by the second means for calculating estimates of entropy;

the memory block is designed for recording the results of compression and mixing obtained from the compression and mixing means, as well as the estimates of entropy; the cells in the first region of the memory block record the results of mixing for all sources, while the cells of the second and third regions of the memory block record the results of mixing of data from reliable sources only; in addition, the first area of the memory block is used to record the current estimates of entropy obtained in the means for analysis and calculation of current estimates, the second area is used to write the given estimates of entropy obtained in the first means for calculating estimates of entropy, and the third region is used for recording the conservative estimates of entropy obtained in the second means for calculating estimates of entropy;

the means for forming a new starting value is intended to verify that the following conditions are true:

(a) the sum of the current estimates of entropy in the first area of the memory block must be equal to or exceed the number 128 for at least three sources, and at least one of these sources must be fast and reliable or slow and reliable;

b) the sum of the given estimates of entropy in the second region of the memory block must be equal to or exceed the number 128 for each of the sources, and all these sources must be fast and reliable or slow and reliable;

c) the sum of the conservative estimates of entropy in the third area of the memory block must be equal to or exceed the number 128 for each of the sources, and all these sources must be fast and reliable or slow and reliable,

and, if all the listed conditions are met, the said means forms a new starting value for the pseudorandom sequence generator by additionally mixing the mixing results stored in the corresponding cells of the first and second areas of the memory block with the previous seed value of the pseudo-random sequence generator, applying iterative cyclic hashing to the resulting merged data;

the means for distributing starting values is intended to obtain a new starting value from the means for forming the new seed value and to transfer the obtained new starting value to the pseudo-random sequence generator.

In a particular embodiment, the external data sources are random processes occurring in a local network of Ethernet type or in a wireless local area network (WLAN); these random processes are the values of the signal levels in the WLAN, the intervals between packets arriving in the network, the intervals between packets leaving the network, the intervals between interrupts, and the interrupts themselves.

In another particular embodiment, the means for forming a new starting value uses, for the formation of a new starting value, the mixed compressed data stored in the corresponding cells of the third region of the memory block at the moment of switching on an external device that is part of one of the local networks.

In a particular embodiment, the means for source analysis and calculation of current estimates of entropy comprises a means for registering the frequency of random events, designed to log the random events of the sources within a certain time interval, and a means for computing the current estimates of entropy based on the frequencies according to the following formula:

H = -∑ p_{i} log_{2} p_{i},

where p_{i} is the probability calculated on the basis of the frequency of occurrence of random events within a certain time interval.

In another particular embodiment, the means for source analysis and calculation of current estimates of entropy calculates the current estimates of entropy based on the frequencies of random events of the sources registered by an external means for registering the frequency of random events; the calculation is carried out according to the following formula:

H = -∑ p_{i} log_{2} p_{i},

where p_{i} is the probability calculated on the basis of the frequency of occurrence of random events within a certain time interval.

In a particular embodiment, the given estimate of the entropy calculated by the first means for calculating estimates of entropy is equal to half of the minimum over a certain sequence of current estimates of entropy.

In another particular embodiment, the conservative estimate of the entropy calculated by the second means for calculating estimates of entropy is equal to half of the minimum over a certain sequence of given estimates of entropy.

In a particular embodiment, each source belongs to one of the following types: fast and reliable (FR), slow and reliable (SR), fast and unreliable (FU), slow and unreliable (SU).

The claimed invention is illustrated by the following drawings: figure 1, which presents the algorithm, i.e. the sequence of operations, of the proposed method; figure 2, which presents a structural diagram of the inventive device for generating the starting value.

As shown in figure 1, in accordance with the inventive method, data 1 from different data sources are subjected to analysis 2 to determine the source type.

The sources are divided into the following types: fast and slow, reliable and unreliable. It should be noted here that each source can be either fast and reliable (FR), or slow and reliable (SR), or fast and unreliable (FU), or slow and unreliable (SU).

Unreliable sources include processes that can be measured or obtained by various persons who have no access rights inside the system, for example hackers. For example, such persons can measure the time intervals between incoming or outgoing packets.

Reliable sources include sources that generate random data that cannot be reproduced outside the system. Among the reliable sources are, for example, random processes occurring in a printer.

After analysis 2 is complete, at step 3 the current estimates of entropy are calculated for each data source based on the frequency of random events of the mentioned sources, and the cells of the first region of the memory block are filled with the accumulated estimates.

It should be noted here that the areas of the memory block are segments of contiguous address space of a random-access memory array. Each cell in the first area of the memory block is used for recording the mentioned estimates of entropy from only one source.

In addition, it should be taken into account that the data stored in the first area of the memory block are further used to update the starting value (seed) of the generator of pseudo-random number sequences (PRNG), in order to reduce the risk of compromise, at the moment of switching on an external device that is part of at least one of the networks. Such a device may be, for example, a printer or any other similar means.

It should be noted that current estimates of entropy for each data source calculated based on the frequency of random events in the sources.

Current estimates of entropy for each source are calculated by the Shannon formula:

H = -∑ p_{i} log_{2} p_{i},

where p_{i} is the probability calculated on the basis of the frequency of occurrence of random events within a certain time interval.

It is important to note that the sources meet the following requirements: 1) they are quasi-stationary, i.e. the probability distribution of the random variable remains unchanged over a long period of time or changes very slowly; 2) the sources are memoryless, i.e. the current value of the random variable does not depend on the values this random variable has taken in the past; 3) the sources are statistically independent.
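As an illustration, the current estimate from the observed frequencies can be sketched as follows (a minimal sketch in Python; the function name and the event representation are ours, not part of the patent):

```python
from collections import Counter
from math import log2

def current_entropy_estimate(events):
    """Shannon entropy estimate, in bits per event, from the raw
    events observed at one source during the fixed time interval."""
    counts = Counter(events)
    total = len(events)
    # p_i: relative frequency of each distinct event value
    return -sum((c / total) * log2(c / total) for c in counts.values())
```

For example, four equiprobable event values give an estimate of 2 bits per event, and a source that always produces the same value gives 0.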

In step 4 perform the following operations:

1. Compression of the data obtained from these sources, with subsequent recording of the result.

2. Mixing the resulting compressed data, then write the received mixed data in the appropriate cell of the memory block.

Data compression is performed with the use of the hash function.

For this purpose the hash function SHA-256 is applied. A description of the hash algorithm computing the hash value for a given argument appears in the US federal standard FIPS PUB 180-2 [3]. However, the use of SHA-256 is not required. One can use, for example, the function GOST R 34.11-94 [4] or a similar hash function.

For SHA-256, the argument is a random binary sequence, predominantly of length less than 2^{64} binary digits. The calculated value is a binary sequence of length 256 bits. The basis of the hash algorithm is a compression function.

The input binary sequence of data from each data source is divided into blocks of 512 bits. The compression function is then sequentially applied to each of the blocks. Therefore, if the length of the input sequence is less than 512 bits, hashing with SHA-256 reduces to a single application of the compression function, and the computational complexity of hashing is minimal.

The resulting compressed value for each data source is stored in the corresponding cell of the first area of the memory block; each memory cell is used to write data from only one source.
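The compression step can be sketched with Python's standard hashlib (a hedged illustration; SHA-256 is one admissible choice per the text, and the function name is ours):

```python
import hashlib

def compress_sample(raw: bytes) -> bytes:
    """Reduce a raw sample from one randomness source to a fixed
    256-bit value; for inputs shorter than 512 bits this amounts to a
    single application of the SHA-256 compression function."""
    return hashlib.sha256(raw).digest()
```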

Mix all of the compressed data for each source is executed under the following conditions:

(1) the increment of the sum of the estimates of entropy must be at least 32;

(2) the data accumulated in the corresponding cells of the first region of the memory block must contain at least 512 bits.

When these conditions are met, the mixing of the compressed data is as follows.

Form input binary sequence data.

To do this, perform the concatenation (combining) of the compressed data of all data sources and the previous mixing result;

then the received binary sequence of length 256(n+1) bits, where n is the number of sources of randomness, is divided into separate blocks of 256 bits.

Next, perform cyclic hashing using SHA-256. For example, for the first three blocks, cyclic hashing proceeds as follows. Calculate the value of the hash function on the concatenation of the first and second blocks. The result of hashing overwrites the data in the first block. Calculate the value of the hash function on the concatenation of the second and third blocks. The result of hashing overwrites the data in the second block. And so on, for all blocks.

On the last, (n+1)-th step, the value of the hash function is calculated on the concatenation of the (n+1)-th and the first blocks (using the value of the first block before the start of the cyclic hashing), and the hash overwrites the data in the original (n+1)-th block. This is one iteration of cyclic hashing.

Full mixing is provided by repeated application of the cyclic hashing procedure to the sequence of blocks, a full update of which occurs upon completion of each iteration. The total number of iterations is equal to n. As a result, the sequence of (n+1) blocks is compressed to 256 binary digits by a single application of SHA-256.

The resulting mixed data are retained in the corresponding cells of the first region of the memory block.
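The iterative cyclic hashing described above can be sketched as follows (an interpretive sketch: we read the wrap-around step as using the value of the first block taken before each iteration, SHA-256 as the hash, and the names are ours):

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()  # 256-bit output

def cyclic_mix(blocks):
    """Mix (n+1) 256-bit blocks: the compressed data of n sources plus
    the previous mixing result.  Performs n iterations of cyclic
    hashing, then compresses the block sequence to one 256-bit value."""
    b = list(blocks)
    n = len(b) - 1                        # number of randomness sources
    for _ in range(n):                    # n iterations give a full update
        first_before = b[0]               # first block, pre-iteration value
        for i in range(n):                # block i <- H(block i || block i+1)
            b[i] = sha256(b[i] + b[i + 1])
        b[n] = sha256(b[n] + first_before)  # wrap-around (n+1)-th step
    return sha256(b"".join(b))            # final 256-bit mixed value
```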

The above procedure of mixing and compression provide:

1) bringing the samples received from various sources to a uniform size;

2) mixing of samples from the same and from different sources of randomness. Applying the hash function SHA-256 allows to achieve both objectives simultaneously.

Mixing by means of the computationally time-consuming cyclic hashing procedure is performed only when enough entropy has accumulated, namely when the increment of the entropy counter for each source is at least 32. The mixing involves samples obtained from different sources of randomness.

Note that, when SHA-256 is applied, a change of one bit in the input sequence leads to a change in a significant number of binary digits of the hash value (up to 256). Then, as a result of two iterations of cyclic hashing, changes can occur in 512 binary digits. Thus, after n iterations, the values of all output bits will depend on each bit of the input sequence.

The known properties of cryptographic hash functions guarantee that this processing will not cause a loss of entropy. Thus, if we assume that all possible sources of randomness are taken into account, the proposed solution provides the maximum rate of accumulation of entropy bits with a minimum amount of stored data.

After step 4 perform the filling of the second region of the memory block, i.e. step 5.

It should be noted here that the data stored in the second region of the memory block are designed for automatic updating of the starting value of the PRNG. It should also be noted that the second region of the memory block contains data obtained using a pessimistic approach to the evaluation of the entropy of sources (the estimates are deliberately understated). This approach makes it possible to guarantee a high level of reliability. For example, if the data in the first area of the memory block were obtained as a result of unrealistically high estimates of entropy, the automatic update using the data of the second region of the memory block can compensate for this disadvantage as follows.

For each trusted source (both fast and slow), based on the current estimates of the entropy of these sources obtained in the way described above, the minimal estimate of the entropy is calculated.

To do this, a series of, for example, eight intermediate estimates, calculated as described above according to the Shannon formula, is recorded. Then the minimum estimate in the series is determined and the given estimate of entropy is calculated as half of this minimum. We denote the series of intermediate estimates f_{i-7}, f_{i-6}, ..., f_{i-1}, f_{i}, where i is the number of the series. Then the minimal estimate of the current value of the entropy of the given source is:

s_{i} = min(f_{i-7}, f_{i-6}, ..., f_{i-1}, f_{i}) · 0.5.

The factor of 0.5 is introduced to compensate for the effect of the mutual influence of different sources. For example, the input packet triggers the interrupt and causes the output packet after a certain time interval. In some cases, such an event can be predicted with high probability.

The choice of a minimal mid-term evaluation is related to the possible non-stationary behavior of the source if the observation was made during a limited time interval, and to reduce the risk of underestimating the entropy as a result of targeted attacks the attacker.
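Steps 5 and 8 use the same operation, half of the minimum over a sliding window of recent estimates, which can be sketched as follows (window sizes 8 and 4 follow the examples in the text; the function name is ours):

```python
def halved_window_minimum(series, window):
    """Given estimate (window=8 over current estimates) or conservative
    estimate (window=4 over given estimates): half of the minimum of
    the last `window` values in the series.  The 0.5 factor compensates
    for possible mutual influence between sources."""
    return min(series[-window:]) * 0.5
```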

The obtained minimal estimates of the entropy of each source are recorded in the sequential cells of the second region of the memory block.

In the next step 6, perform the accumulation of the given estimates by summing these estimates, with subsequent recording of the accumulated results in the corresponding cells in the second region of the memory block; in each cell, record the sum of estimates from only one source.

Then, in step 7, write the mixed compressed data, obtained earlier in the mentioned way, into the corresponding cells in the second region of the memory block. It should be noted here that only mixed data from trusted sources are recorded in these cells. Writing data into the cells of the second region of the memory block is performed in the same way as writing into the cells of the first region; that is, into every cell of the second region of the memory block data from only one trusted source are written.

In step 8 perform the filling of the third region of the memory block.

It should be noted here that the data of the third region of the memory block are used exclusively to load the starting value of the PRNG at the moment of switching on an external device, part of the mentioned networks, such as a printer. The data in the third region of the memory block are also formed on the basis of pessimistic estimates of the entropy of reliable sources.

For each trusted source of data, a set of given estimates of the entropy stored in the sequential cells of the second region of the memory block is determined.

On the basis of the received set of estimates, the forming device calculates a conservative estimate of entropy. This calculation is similar to the previously mentioned calculation of the given estimates of entropy, i.e. it is the division of the given estimates of entropy by two.

So, for example, denote a series of four minimal estimates of entropy for a given source as s_{i-3}, s_{i-2}, s_{i-1}, s_{i}, where i is the number of the series. Then the conservative estimate of the entropy is:

p_{i} = min(s_{i-3}, s_{i-2}, s_{i-1}, s_{i}) · 0.5.

It is assumed that at the moment of switching on, the third area of memory is guaranteed to be filled, because the time interval between two power-ups of the printing device (on-work-off-on) has the required duration, sufficient for filling.

It is also assumed that the sample consists of a set of readings, where a reading is understood to be a binary sequence of some length.

The calculated conservative estimates of entropy are likewise stored in the third region of the memory block.

Next, in step 9, perform the accumulation of the received conservative estimates for each reliable data source by summing these estimates of the entropy of each trusted source, with subsequent recording of the obtained sums into the cells of the third region of the memory block; into each cell, write the sum from a single source.

At step 10, write the mixed compressed data, obtained by the previously mentioned method, into the appropriate cells of the third region of the memory block. The recording of these data is carried out in the same way as writing data into the corresponding cells in the second region of the memory block, i.e. into each cell write mixed data from only one trusted source.

The following rule is set for the distribution of readings among the three memory areas:

FR source. For a sample of five readings:

two readings into the first area of the memory block;

two readings into the second area of the memory block;

one reading into the third area of the memory block.

SR source. For a sample of seven readings:

four readings into the first area of the memory block;

two readings into the second area of the memory block;

one reading into the third area of the memory block.

FU source. For a sample of five readings:

three readings into the first area of the memory block;

two readings into the second area of the memory block.

SU source. For a sample of seven readings:

five readings into the first area of the memory block;

two readings into the second area of the memory block.

For example, if eight sources are involved in the system - four reliable and four unreliable - then the first and second areas of the memory block may contain eight sub-regions each. The third area of the memory block contains four sub-regions of memory.
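The distribution rule above can be recorded as a simple lookup table (counts per memory area, in the order area 1, area 2, area 3; this representation is ours, not the patent's):

```python
# Readings per sample routed to memory areas 1, 2 and 3, by source type.
# Unreliable sources (FU, SU) contribute nothing to area 3.
DISTRIBUTION = {
    "FR": (2, 2, 1),  # fast reliable: sample of five readings
    "SR": (4, 2, 1),  # slow reliable: sample of seven readings
    "FU": (3, 2, 0),  # fast unreliable: sample of five readings
    "SU": (5, 2, 0),  # slow unreliable: sample of seven readings
}
```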

At step 11 verify the following conditions:

(a) the sum of current estimates of entropy must be equal to or exceed the number 128 for at least three sources, and at least one of these sources must be fast and reliable or slow and reliable,

b) the above sums of the given estimates of entropy must be equal to or exceed the number 128 for each source, and all sources must be fast and reliable or slow and reliable,

c) the above sums of conservative estimates of entropy must be equal to or exceed the number 128 for each source, and all sources must be fast and reliable or slow and reliable.
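The check of step 11 can be sketched as follows (a sketch under the assumption that the accumulated sums are kept per source; the names and data layout are ours):

```python
RELIABLE = {"FR", "SR"}  # fast-reliable and slow-reliable types

def ready_to_reseed(current_sums, given_sums, conservative_sums, types):
    """Verify conditions (a)-(c): each *_sums maps a source id to its
    accumulated entropy sum; `types` maps a source id to FR/SR/FU/SU."""
    # (a) current sums >= 128 for at least three sources, one reliable
    ok = [s for s, v in current_sums.items() if v >= 128]
    cond_a = len(ok) >= 3 and any(types[s] in RELIABLE for s in ok)
    # (b) given sums >= 128 for every reliable source
    cond_b = all(v >= 128 for s, v in given_sums.items()
                 if types[s] in RELIABLE)
    # (c) conservative sums >= 128 for every reliable source
    cond_c = all(v >= 128 for s, v in conservative_sums.items()
                 if types[s] in RELIABLE)
    return cond_a and cond_b and cond_c
```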

If all these conditions are met, a new starting value for the pseudorandom sequence generator is formed (step 12). To do this, the mixed compressed data stored in the respective cells of the first and second areas of the memory block are combined with the previous seed value of the pseudo-random sequence generator, and iterative cyclic hashing is applied to the resulting merged data extracted from the above-mentioned cells of the first and second areas of the memory block and the previous starting value.

It should also be noted that, when generating a new seed value for the PRNG, the data stored in the third region of the memory block are used only at the moment of switching on an external device, such as a printer. In that case these data are used together with the data stored in the first and second areas of the memory block.

Let us consider the stages of data aggregation and hashing of the received data.

As noted previously, the first and second areas hold the mixed compressed data of the respective data sources. These data are 256 bits long. The previous initial value of the PRNG is also 256 bits long.

Thus, the length of the binary sequence after concatenation equals 768 bits.
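The concatenation step can be sketched as follows (the function name and byte-level interface are illustrative assumptions):

```python
def concatenate(area1_data: bytes, area2_data: bytes, prev_seed: bytes) -> bytes:
    # Each input is a 256-bit (32-byte) value; the result is 768 bits long.
    assert len(area1_data) == len(area2_data) == len(prev_seed) == 32
    return area1_data + area2_data + prev_seed
```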

In the second stage, similarly to the mixing procedure, an iterative cyclic hashing based on SHA-256 is performed.

For this purpose, a 512-bit binary block is fed to the input of the hash function. The hashing result then replaces the first 256 bits of the input block. The calculation proceeds by sequentially hashing the input sequence with a fixed shift of 256 bits.

We denote the hash function H(·) and the input block B_{i}, consisting of two sub-blocks S_{i1} and S_{i2}: B_{i}=S_{i1}||S_{i2}. Suppose that the input sequence of length 768 bits consists of a single block B_{1} and one sub-block S_{21}. Then applying the hash function to the block B_{1} gives the result H(B_{1})||S_{12}||S_{21}. The next step applies the hash function to the block S_{12}||S_{21}. The result is H(B_{1})||H(S_{12}||S_{21})||S_{21}. The final step applies the hash function to the concatenation of the first and third sub-blocks, with the result H(B_{1})||H(S_{12}||S_{21})||H(H(B_{1})||S_{21}).

For a full update, three iterations must be performed. For example, after the second iteration the result is: H(H(B_{1})||H(S_{12}||S_{21})) || H(H(S_{12}||S_{21})||H(H(B_{1})||S_{21})) || H(H(H(B_{1})||H(S_{12}||S_{21}))||H(H(B_{1})||S_{21})).
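One pass of the cyclic hashing just described can be sketched in Python. This is a sketch assuming the standard-library SHA-256; the three 256-bit sub-blocks of the state correspond to S_{11}, S_{12} and S_{21}:

```python
import hashlib

def h(block: bytes) -> bytes:
    """SHA-256 of a block; applied here to 512-bit (64-byte) inputs."""
    return hashlib.sha256(block).digest()

def iterate(state):
    """One iteration of the cyclic hashing over three 256-bit sub-blocks."""
    s = list(state)
    s[0] = h(s[0] + s[1])  # hash the first 512-bit block, replace the first sub-block
    s[1] = h(s[1] + s[2])  # shift by 256 bits, replace the second sub-block
    s[2] = h(s[0] + s[2])  # hash the (new) first and third sub-blocks, replace the third
    return s
```

Three successive calls to `iterate` then give the full update mentioned above.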

When the starting value is updated at switch-on of the printing device, hashing of a 512-bit block is performed.

The resulting new starting value serves as the seed for the pseudorandom number generator, which, on the basis of this new value, generates new pseudorandom numbers; these are then used in protection systems, for example for forming passwords, cryptographic keys, etc.

The above method is implemented in a device for generating a seed value for the pseudorandom sequence generator.

The structure of the device is shown in figure 2. The device comprises: block 13, a means for analyzing the type of data source and calculating current entropy estimates; block 14, a means of data compression; block 15, intended for mixing data; block 16, performing summation of entropy estimates; block 17, a means of data accumulation and calculation of entropy estimates; block 18, a means of forming a new starting value; and block 19, intended for issuing the starting value to the pseudorandom sequence generator. Data from the data sources are fed to the device input.

It should be noted that these means can have either a hardware or a software implementation; block 13, in the case of a hardware implementation, is either a processor or a combinational circuit, for example one based on AND and OR elements.

Block 13 in one embodiment may contain a means for registering the frequencies of random events and a means for computing current entropy estimates. The registration means records the random events of the mentioned sources within a certain time interval, while the computing means calculates the current entropy estimates from the obtained frequencies using the Shannon formula indicated previously.
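A sketch of the current entropy estimate for one observation interval, assuming the Shannon formula over observed event frequencies (the list-of-events interface is an illustrative assumption):

```python
import math
from collections import Counter

def current_entropy_estimate(events):
    """Shannon entropy, in bits, of events observed in a fixed time interval."""
    counts = Counter(events)
    total = len(events)
    # p_i = n_i / total is the frequency-based probability of each event value
    return -sum((n / total) * math.log2(n / total) for n in counts.values())
```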

However, there may be cases when the mentioned means for registering event frequencies is not part of the analysis means (block 13) and is external to it.

Block 17 contains several functional elements (not shown in figure 2), which in this example implementation are two means for calculating entropy estimates, two summation means, and the memory block divided into three areas, each of which is in turn divided into many cells. The number of cells in the areas of the memory block depends on the number of sources.

The device operates as follows.

Data from different data sources, which as indicated previously represent different stochastic processes, are fed to the input of block 13. Based on the received data, block 13 analyzes the sources and calculates the current entropy estimates for each source according to the claimed method.

To calculate the current estimates, block 13 uses the frequency data obtained from the registration of random events. On the basis of the received frequencies, block 13 calculates the current entropy estimates.

The calculated current entropy estimates are fed to the input of block 16, which sums the received current entropy estimates for each source.

The accumulated estimates from the output of block 16 are fed to the input of block 17, which performs data accumulation and calculation of entropy estimates.

Unit 17 operates as follows.

The first means for calculating entropy estimates, part of block 17, calculates the reduced entropy estimates for each data source on the basis of the current entropy estimates of that source received from block 13. The calculation of these estimates is described in detail in the disclosure of the method. The obtained estimates are transmitted to the first summation means.

The first summation means calculates, for each data source, the sum of the reduced entropy estimates of that particular source by summing the above estimates.

The second means for calculating entropy estimates selects intermediate values of the reduced entropy estimates for each reliable source, received from the first means for calculating entropy estimates, and computes a conservative entropy estimate for each reliable data source on the basis of the selected intermediate values. The algorithm for calculating these estimates is described earlier in the disclosure of the method.

The second summation means accumulates the conservative entropy estimates for each reliable data source by summing the conservative entropy estimates of that source calculated by the second calculating means.
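Based on the description above and the half-of-minimum rule stated later in the claims, the two derived estimates can be sketched as:

```python
def reduced_estimate(current_estimates):
    # Reduced estimate: half of the minimum over a sequence of current estimates.
    return min(current_estimates) / 2

def conservative_estimate(reduced_estimates):
    # Conservative estimate: half of the minimum over a sequence of reduced estimates.
    return min(reduced_estimates) / 2
```

Each halving makes the estimate more pessimistic, so the conservative figure understates the entropy actually gathered from a reliable source.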

Block 14, the compression means, is connected in parallel with block 13, and data from the sources are fed to its input. On receiving data from the sources, block 14 compresses these data as described earlier in the disclosure. The resulting compressed data from the output of block 14 arrive at the input of block 15, the data mixing means. Block 15 mixes the received data. The mixing procedure and the mixing conditions are described earlier in the disclosure of the method.

The memory block (not shown in figure 2), which is part of block 17, stores the mixed compressed data received from the mixing means and the entropy estimates obtained from block 13 and block 16. The cells of the first area of the memory block store mixed compressed data for all sources. The cells of the second and third areas of the memory block store mixed compressed data from reliable sources only.

Recording is carried out in the same way as described above for the method, namely, each cell stores mixed compressed data from only one source.

In addition, the corresponding cells of the first area of the memory block store the current entropy estimates obtained in block 13. The corresponding cells of the second area of the memory block store the reduced entropy estimates obtained in the first means for calculating entropy estimates. The corresponding cells of the third area of the memory block store the conservative entropy estimates obtained in the second means for calculating entropy estimates.
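The recording rules for the three areas can be sketched as follows (the cell layout and field names are illustrative assumptions):

```python
def record(block, source_id, reliable, mixed_data, cur_est, reduced_est, conservative_est):
    """Write one source's mixed compressed data and entropy estimates.

    block is a dict with 'area1', 'area2', 'area3' sub-dicts; each cell
    holds data from exactly one source.
    """
    block['area1'][source_id] = {'data': mixed_data, 'entropy': cur_est}
    if reliable:  # areas 2 and 3 hold data from reliable sources only
        block['area2'][source_id] = {'data': mixed_data, 'entropy': reduced_est}
        block['area3'][source_id] = {'data': mixed_data, 'entropy': conservative_est}
```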

The data generated in block 17 and block 15 are fed to the input of block 18, the means of forming a new starting value. Block 18 checks the following conditions:

(a) the sum of the current entropy estimates in the first area of the memory block must equal or exceed 128 for at least three sources, and at least one of these sources must be fast and reliable or slow and reliable;

b) the sums of the reduced entropy estimates in the second area of the memory block must equal or exceed 128 for each source, and all such sources must be fast and reliable or slow and reliable;

c) the sums of the conservative entropy estimates in the third area of the memory block must equal or exceed 128 for each source, and all such sources must be fast and reliable or slow and reliable.

If these conditions are met, block 18 generates a new starting value for the pseudorandom sequence generator by combining the mixed compressed data stored in the respective cells of the first and second areas of the memory block with the previous seed value of the pseudorandom sequence generator, and by applying an iterative cyclic hashing to the combined data and the previous seed value. The procedure of forming the new starting value is described earlier in the disclosure of the method.
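A self-contained sketch of the seed formation in block 18, combining the concatenation with three iterations of the cyclic SHA-256 hashing described for the method. Which 256 bits of the updated state serve as the final seed is not specified in this excerpt, so the sketch returns the whole 768-bit state:

```python
import hashlib

def form_new_seed(area1_data: bytes, area2_data: bytes, prev_seed: bytes) -> bytes:
    """Combine two 256-bit mixed values with the 256-bit previous seed, then hash."""
    h = lambda x: hashlib.sha256(x).digest()
    s = [area1_data, area2_data, prev_seed]  # three 256-bit sub-blocks, 768 bits total
    for _ in range(3):                       # three iterations give a full update
        s[0] = h(s[0] + s[1])                # hash the first 512-bit block
        s[1] = h(s[1] + s[2])                # shift by 256 bits
        s[2] = h(s[0] + s[2])                # hash the new first and old third sub-blocks
    return b''.join(s)
```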

Having formed a new starting value, block 18 supplies the resulting value to the input of block 19, the means for issuing the starting value. Block 19 transmits the received new starting value to the input of the pseudorandom number generator, which, on the basis of the new starting value, generates new pseudorandom numbers from which passwords, cryptographic keys, etc. are formed.

1. A method of forming the starting value for a pseudorandom sequence generator in the absence of a hardware source of random data, comprising the following operations:

accumulate data from various external sources, representing random processes, in the memory block;

analyze the data to determine source type: fast or slow, reliable or unreliable;

for each source, calculate the current estimates of entropy according to the data obtained from this source;

carry out, for each source, the accumulation of current entropy estimates by summing these estimates, and record them in the memory block, each cell of the memory block being used to record current entropy estimates from only one source;

perform compression of the data received from each source using the hash function, and then write the compression result into another cell of the memory block, such a memory cell being used to record compressed data from only one source;

carry out mixing of the compressed data using the hash function and write the mixing result into another cell of the memory block;

determine the minimum entropy estimate for every reliable source on the basis of the obtained current entropy estimates;

carry out the calculation of the reduced entropy estimates by dividing the minimum estimate by two, and store the sequence of reduced values in a sequence of cells of another, for example the second, area of the memory block;

carry out the accumulation of the reduced entropy estimates for each reliable source by summing these estimates, and record the sums of the reduced estimates in the corresponding cells of the second area of the memory block, each cell of the memory block being used to record the sum of entropy estimates from only one source;

for each reliable data source, write the compressed data into another cell of the second memory area, such a cell of the memory block being used to record compressed data from only one source;

for each reliable data source, write the mixed data into the corresponding cell of the second area of the memory block;

for each reliable data source, determine a set of reduced entropy estimates stored in the sequence of cells of the second area of the memory block;

carry out the calculation of conservative entropy estimates by dividing by two the previously reduced entropy estimates stored in the respective cells of the second area of the memory block, and store each result in the corresponding cell of another, for example the third, area of the memory block;

carry out the accumulation of conservative entropy estimates for each reliable source by summing these estimates, and record the sums of the conservative estimates in the sequence of cells of the third area of the memory block, each cell of the memory block being used to record conservative entropy estimates from only one source;

for each reliable data source, record the compressed data in the corresponding cell of the third area of the memory block, each memory cell being used to record compressed data from only one source;

for each reliable data source, write the mixed data into the corresponding cell of the third area of the memory block;

verify the following conditions:

(a) the sum of the current entropy estimates must equal or exceed 128 for at least three sources, and at least one of these sources must be fast and reliable or slow and reliable,

b) the sums of the reduced entropy estimates must equal or exceed 128 for each source, and all such sources must be fast and reliable or slow and reliable,

c) the sums of the conservative entropy estimates must equal or exceed 128 for each source, and all such sources must be fast and reliable or slow and reliable,

and, if all these conditions are met, form a new starting value for the pseudorandom sequence generator by additional mixing of the already mixed data stored in the respective cells of the first and second areas of the memory block and of the previous starting value of the pseudorandom sequence generator, this formation being carried out by applying an iterative cyclic hashing to the combined data extracted from the cells of the first and second areas of the memory block and to the previous starting value.

2. The method according to claim 1, characterized in that it uses random processes either in a local area network of Ethernet type or in a wireless local area network (WLAN); as the data of these random processes, the signal level values in the network, the intervals between packets arriving from the network, the intervals between packets leaving for the network, the intervals between interrupts, and the interrupts themselves are chosen.

3. The method according to claim 2, characterized in that the mixed data stored in the corresponding cell of the third area of the memory block are used in the formation of a new starting value at the moment of switching on an external device included in at least one of the networks.

4. The method according to claim 3, characterized in that the areas of the memory block are formed so that they are segments of contiguous address space of a random-access memory array.

5. The method according to claim 4, characterized in that, to determine the entropy, the frequencies of occurrence of random events of a specific source are registered within a fixed time interval, and on the basis of the obtained frequencies the current entropy estimates are calculated by the following formula:

H = -Σ_{i} p_{i}·log_{2} p_{i},

where p_{i} is the probability calculated on the basis of the frequency of occurrence of random events during the fixed time interval.

6. The method according to claim 5, characterized in that each source belongs to one of the following types: fast and reliable (FR), slow and reliable (SR), fast and unreliable (FU), slow and unreliable (SU).

7. A device forming the starting value for a pseudorandom sequence generator in the absence of a hardware source of random data, containing a means for analyzing the data sources and calculating current entropy estimates, a data compression means, a data mixing means, a means for summing entropy estimates, a means for accumulating data and forming entropy estimates, a means for forming a new seed value, and a means for issuing the starting value to the pseudorandom sequence generator; the means for source analysis and calculation of current entropy estimates serves to receive data from different sources, each representing a different random process, to identify the type of each source based on the received data, and to calculate the current entropy estimates for each of the above-mentioned sources, these sources being divided into types whose total number is determined by the combinations of the following characteristics: fast/slow, reliable/unreliable; the output of the analysis means is connected to the input of the means for summing current entropy estimates, whose output is connected to the first input of the means for accumulating data and calculating entropy estimates; the means for accumulating data and calculating entropy estimates contains more than one means for calculating entropy estimates, more than one summation means, and the memory block divided into three areas, each of which is in turn divided into many cells; in parallel with the analysis means, the data compression means is connected, whose input receives the data from the sources, one of the outputs of the compression means being connected to the input of the data mixing means and the other output being connected to the second input of the means for accumulating data and calculating entropy estimates; the output of the mixing means is connected to the third input of the means for accumulating data and calculating entropy estimates and, in parallel, to the input of the means for forming a new starting value, to which the output of the means for accumulating data and calculating entropy estimates is also connected; the output of the means for forming a new starting value is connected to the input of the means for issuing the starting value, from the output of which the new starting value is fed to the input of the pseudorandom number generator.

8. The device according to claim 7, characterized in that the inputs of the analysis means and of the compression means receive data from random processes, namely random processes in a LAN of Ethernet type or in a wireless local area network (WLAN), these random processes being the signal level values in the WLAN, the intervals between packets arriving from the network, the intervals between packets leaving for the network, the intervals between interrupts, and the interrupts themselves.

9. The device according to claim 8, characterized in that the means for forming a new starting value uses the mixed compressed data stored in the respective cells of the third area of the memory block at the moment of switching on an external device that is part of one of the local networks.

10. The device according to claim 9, characterized in that the means for source analysis and calculation of current entropy estimates includes a means for registering the frequencies of random events, providing registration of the random events of the sources within a fixed time interval, and a means for computing the current entropy estimates, which calculates the current entropy estimates from the frequencies according to the following formula:

H = -Σ_{i} p_{i}·log_{2} p_{i},

where p_{i} is the probability calculated on the basis of the frequency of occurrence of random events during the fixed time interval.

11. The device according to claim 9, characterized in that the means for source analysis and calculation of current entropy estimates calculates the current entropy estimates from the frequencies of random events of the sources registered by an external means for registering the frequencies of random events, the calculation being carried out according to the following formula:

H = -Σ_{i} p_{i}·log_{2} p_{i},

where p_{i} is the probability calculated on the basis of the frequency of occurrence of random events during the fixed time interval.

12. The device according to claim 10 or 11, characterized in that the above-mentioned reduced entropy estimate calculated by the first means for calculating entropy estimates is equal to half the minimum over a certain sequence of current entropy estimates.

13. The device according to claim 12, characterized in that the conservative entropy estimate calculated by the second means for calculating entropy estimates is equal to half the minimum over a certain sequence of reduced entropy estimates.

14. The device according to item 13, wherein each source belongs to one of the following types: fast and reliable (FR), slow and reliable (SR), fast and unreliable (FU), slow and unreliable (SU).

**Same patents:**

FIELD: electric communications and computer engineering, in particular, methods and devices for cryptographic transformation of data.

SUBSTANCE: the essence of method is in generation of binary vector, appropriate for date and time of discontinuous message transfer, generation of binary vector of secret parameter, generation of binary identification vector and addition thereof to discontinuous message. Method is different from known methods because it includes additionally forming a random binary vector and binary vector of protection key, while binary vector of secret parameter is formed by double compressing of random binary vector, while binary identification vector is formed by transformation in ring of residue classes by module p of binary vector, appropriate for date and time of transfer of discontinuous message, and binary vector of secret parameter.

EFFECT: rejection of false messages, increased speed of process of confirming authenticity of discontinuous message.

1 dwg

FIELD: electric communications and computer engineering.

SUBSTANCE: the essence of stream data encryption method includes generation of encryption key in form of binary vector n bits long, generation of two or more pseudo-random series of symbols in form of binary vectors k bits long, division of data flow into block-symbols in form of binary vectors k bits long, transformation of block-symbols to encrypted message by using pseudo-random series of symbols and nonlinear cryptographic transformations, and transmission over communication line, while one of pseudo-random series of symbols is made in form of binary vectors k bits long by taking information from various bits of shift register, and another pseudo-random series is made in form of binary vectors k bits long by using "1" symbol in zero bit of binary vector, and for other bits - symbols taken from k-1 various bits of register, which are used for transformation by operations of addition and multiplication of symbols in residue class ring by module p=2^{k}.

EFFECT: increased data encryption speed and expanded range of change of encryption's resistance to attacks on basis of known and selected original texts.

2 dwg

FIELD: engineering of systems for loading and reproducing protective unit of content.

SUBSTANCE: in accordance to invention, in receiving device 110 for protected preservation of unit 102 of content on carrier 111 of information unit 102 of content is stored in protected format and has associated license file, file 141 of license being encrypted with usage of open key, associated with a group of reproduction devices 120,121, and, thus, each reproduction device 121 in group can decrypt file 141 of license and reproduce unit 102 of content, and devices not belonging to group can not do that, while device 121 for reproduction may provide the open key, specific for given device, to system for controlling content distribution, and then system for controlling content distribution returns secret key for group, encrypted with open key of device 121 for reproduction, after that device 121 of reproduction by protected method receives secret key of group and may decrypt file 141 of license.

EFFECT: creation of system for loading and reproducing protected unit of content, making it possible to constantly control usage of unit of content.

3 cl, 4 dwg

FIELD: access to protected system restriction technics; avoidance of accidental persons access to system.

SUBSTANCE: fingerprint image is registered with following user personality identification. Some peculiarities of papillary pattern coordinates are determined and using difference of coordinates of peculiarities of received fingerprint image and stored in database positive or negative decision to grant access to system is made.

EFFECT: increased level of protection against access of accidental persons.

3 cl, 2 dwg

FIELD: mobile communications including mobile terminal control systems using digital signature.

SUBSTANCE: proposed system designed for controlling mobile terminal in compliance with information about mobile terminal condition has user-mounted server that functions to produce instruction message for respective mobile terminal in compliance with information about its condition, to add digital signature to compiled instruction message, and to transfer resultant message; mobile terminal functions to authenticate instruction message transferred from server and to execute power turn-off operations, as well as to mobile terminal input and output records in compliance with authenticated instruction message.

EFFECT: improved design of mobile terminal control system.

14 cl, 4 dwg

FIELD: computers.

SUBSTANCE: generator of random alphabet-numeric codes is installed on mail server. Generator generates random alphabet-numeric code, which is valid limited times for a limited time interval. Its graphical representation, called "electronic postage stamp", marks the outgoing mail, and recipient user's server check the compliance of the code in the mail to sender's address, recipient address, validity time and times of usage of "electronic postage stamp".

EFFECT: avoidance of automatic mass-delivery of unauthorized mails and virus distribution.

1 dwg

FIELD: unauthorized access protected development of executable program code for programmable portable information medium.

SUBSTANCE: initial program text is created on user's computer, transferred to information medium issuer's computer, where initial text is compiled and assembled; executable program code is created, which is enciphered and converted to transport code, which is downloaded to information medium through user's computer. At the same time during preliminary assembly information medium is equipped with instrumental program means for restoring executable program code from transport code, which is presented in intermediate format. Also system for distributed development of executed program for portable information medium, and information medium are disclosed.

EFFECT: increased data protection.

20 cl, 9 dwg

FIELD: cryptography technique; hidden storage and transfer of confidential information through open communication channels; marking of images containing large amounts of additional information.

SUBSTANCE: method for embedding additional information into digital images consists in substitution of separate bits in bytes of initial image. The remaining part of bits is used for correction of final digital image. The initial digital image is separated into bit layers. One of the obtained bit layers which is represented by bit sequence is chosen for writing additional information. Writing of additional information into obtained bit sequence is made using code. During writing of additional information using code bits in the obtained bit sequence, that are located on the margins of all changes of same bit sequences of zeroes and ones, are replaced in accordance to bits of written additional information.

EFFECT: increased volume of embedded information and ensured high tolerance of messages against some steganoanalysis methods.

2 dwg, 4 tbl

FIELD: mobile communication systems.

SUBSTANCE: proposed method for reallocating radio network server subsystem includes definition of radio link servo subsystem reallocation in network; transfer of radio resource control message corresponding to mentioned subsystem reallocation to terminal so as to enable controlled data exchange with terminal; and transfer of response radio resource control message corresponding to reallocation of radio network server subsystem to radio network controller that also receives radio resource control message.

EFFECT: enlarged functional capabilities.

70 cl, 12 dwg

FIELD: computer science, possible use in imitators of random processes, and also in specialized and universal computing machines.

SUBSTANCE: device has random number sensor, clock impulse generator, stepped voltage generator, comparison block, counter, decoder, trigger, impulse generator, memory blocks, delay elements, AND elements, multiplexer, adder, block for setting source data, block of adders, block of subtracters, block of amplitude discriminators, code-amplitude transformer, blocks of elements AND, elements OR.

EFFECT: expanded functional capabilities of device.

1 dwg

FIELD: computer science.

SUBSTANCE: generator has set-point generator 1, generator 2 of exponential voltage, generator 3 of evenly distributed random numbers, digital-analog converter 4, elements OR 5,6, block 7 for comparison, device for pulse generation 8, forbidding element 9, trigger 10, multiplication block 11, input 12 and output 13 of device. Requests stream is formed of elementary stream by excluding one request with preservation of second request, i.e. at output 13 of generator through temporal ranges, distributed in accordance to Erlang law of second order, pulses are generated, modeling receipt of requests.

EFFECT: decreased hardware costs.

1 dwg

FIELD: engineering of pseudo-noise series generators with arbitrary number of bits, while said number of bits is transferred in parallel manner during each clock pulse.

SUBSTANCE: beginning values of states are loaded in registers of parallel pseudo-noise generator, which immediately generates following n bits of pseudo-noise series, where n - arbitrary number, depending on required productiveness level. Then, first sub-portion of pseudo-noise generator in accordance to invention receives current state of pseudo-noise generator and outputs state of n bits pseudo-noise generator in the future.

EFFECT: increased speed of operation, realization of parallel processing for capturing and demodulating processes.

3 cl, 9 dwg

FIELD: computer science.

SUBSTANCE: device has random numbers source, N-digit selector-multiplexer, RAM, ranges control block, generations number control block, J-input OR element, AND elements block. Because series of given values of data set is broken in ranges and frequency of their appearance is set within certain limits, random series is generated with distribution law, presented in form of ranges.

EFFECT: broader functional capabilities.

3 cl, 7 dwg

FIELD: cryptography.

SUBSTANCE: method includes generating random numbers with use of displacement register with check connection, elementary digit of which is a q-based symbol (q=2^{l}, l - binary symbol length) at length of q-based digits register, in check connection networks nonlinear two-parameter operations on q-based symbols F (u_{b}, u_{d}) are used, on basis of random replacement tables, for generating next random number values z_{1}=F(u_{i}, u_{j}), z_{2}=F(u_{t}, u_{m}), z_{g}=F(z_{1}, z_{2}) are calculated, where u_{i}, u_{j}, u_{t}, u_{m} - values of filling of respective register digits, value of result in check connection networks z_{g} is recorded to g digit of displacement register and is a next result of random numbers generation, after which displacement of register contents for one q-based digit is performed.

EFFECT: higher speed and efficiency.

3 cl
