[PATCH xorg-gtest 1/4] xserver: increase timeout to 3 seconds

Chase Douglas chase.douglas at ubuntu.com
Fri Oct 26 09:27:11 PDT 2012


On Thu, Oct 25, 2012 at 4:11 PM, Peter Hutterer
<peter.hutterer at who-t.net> wrote:
> A server without a config file that inits all input devices takes just over
> a second on my machine. Up this timeout so we don't get spurious signals
> later.
>
> Signed-off-by: Peter Hutterer <peter.hutterer at who-t.net>
> ---
>  src/xserver.cpp | 2 +-
>  1 file changed, 1 insertion(+), 1 deletion(-)
>
> diff --git a/src/xserver.cpp b/src/xserver.cpp
> index 29c0430..9b163bb 100644
> --- a/src/xserver.cpp
> +++ b/src/xserver.cpp
> @@ -419,7 +419,7 @@ void xorg::testing::XServer::Start(const std::string &program) {
>    std::string err_msg;
>
>    sigset_t sig_mask;
> -  struct timespec sig_timeout = {1, 0}; /* 1 sec + 0 nsec */
> +  struct timespec sig_timeout = {3, 0}; /* 3 sec + 0 nsec */
>
>    /* add SIGUSR1 to the signal mask */
>    sigemptyset(&sig_mask);

One might wonder whether the server failing to start within 1 second
is a real failure :). I'm OK with this change; it just seems like
startup shouldn't take that long...
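For context, the code being patched waits for the X server to signal
readiness via SIGUSR1, bailing out if the signal doesn't arrive before
the timeout. A minimal sketch of that pattern, assuming a POSIX
environment (the function name and structure here are illustrative,
not the actual xorg-gtest code):

#include <signal.h>
#include <stdio.h>
#include <time.h>

static int wait_for_server_ready(void)
{
    sigset_t sig_mask;
    struct timespec sig_timeout = {3, 0}; /* 3 sec + 0 nsec */

    /* add SIGUSR1 to the signal mask */
    sigemptyset(&sig_mask);
    sigaddset(&sig_mask, SIGUSR1);

    /* sigtimedwait() requires the signal to be blocked in the
       calling thread, otherwise the default disposition applies */
    sigprocmask(SIG_BLOCK, &sig_mask, NULL);

    /* block until SIGUSR1 arrives or the timeout expires;
       on timeout it returns -1 with errno set to EAGAIN */
    if (sigtimedwait(&sig_mask, NULL, &sig_timeout) == SIGUSR1)
        return 0;   /* server signalled it is ready */

    perror("sigtimedwait");
    return -1;
}

With the old {1, 0} timeout, any startup slower than one second (e.g.
a server initializing all input devices, per Peter's measurement)
would hit the EAGAIN path and leave the pending SIGUSR1 to fire later,
which is the "spurious signals" the commit message refers to.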

For the series:

Reviewed-by: Chase Douglas <chase.douglas at ubuntu.com>
