Define $c_{n+1} = 1$ and $q_{n+1} = 0$. Observe that the system above can then be written in the following $(n+2) \times (n + 2)$-matrix form: $K_{\lambda} \mathbf{q} = - \mathbf{c}$, where $K_{\lambda}$ has $\text{diag}(k_{i} + \lambda)$ as its upper $(n+1) \times (n+1)$ submatrix, $(0 \;\; 1 \;\; 1 \;\; \cdots \;\; 1 \;\; 0)$ as its bottom row, and $(0 \;\; 0 \;\; \cdots \;\; 0 \;\; 1)^{\top}$ as its rightmost column. (Enlarging the matrix in this way encodes the constraint $\sum_{i = 1}^{n} q_{i} = 1$.)
Since $\lambda$ is unknown at this stage, we proceed under the assumption that $K_{\lambda}$ is nonsingular (i.e., $\det K_{\lambda} \neq 0$) and solve the constrained system by matrix inversion: $\mathbf{q} = - K_{\lambda}^{-1} \mathbf{c}$. The result is a vector $\mathbf{q}$ expressed in terms of the still-unknown $\lambda$ and the given constants. To find the allowed values of $\lambda$, solve the equation $\mathbf{q} \cdot \mathbf{1} - 1 = 0$ numerically (or analytically); since each entry of $\mathbf{q}$ is a rational function of $\lambda$, clearing denominators turns this condition into a polynomial equation in $\lambda$.
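Because the rightmost column of $K_{\lambda}$ vanishes except in its last entry, the first $n+1$ rows stay diagonal, so the symbolic solution has $q_{i} = -c_{i}/(k_{i} + \lambda)$ and the condition $\mathbf{q} \cdot \mathbf{1} - 1 = 0$ reduces to a scalar equation in $\lambda$ alone. The following is a minimal numerical sketch of that last step: the values of $k_{i}$ and $c_{i}$, the bracketing interval, and the use of SciPy's `brentq` root finder are illustrative assumptions, not part of the construction above.

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical data for the constrained components q_1, ..., q_n.
k = np.array([1.0, 2.0, 5.0])
c = np.array([-0.8, 0.5, 1.2])

def residual(lam):
    """q(lambda) . 1 - 1, using q_i = -c_i / (k_i + lambda) from the diagonal rows."""
    q = -c / (k + lam)
    return q.sum() - 1.0

# residual has poles at lambda = -k_i; bracket a root on an interval avoiding them.
# Other admissible values of lambda lie between consecutive poles and can be
# bracketed the same way (or found by clearing denominators and calling np.roots).
lam = brentq(residual, -min(k) + 1e-6, 100.0)
q = -c / (k + lam)
print(f"lambda = {lam:.6f}, sum(q) = {q.sum():.6f}")
```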