mirror of https://github.com/THUDM/ChatGLM-6B
Update bug_report.md
parent ca657e4080
commit 0e6c223007
@@ -1,34 +1,32 @@
----
-name: Bug report
-about: Create a report to help us improve
-title: ''
-labels: ''
-assignees: ''
-
----
-
-**Describe the bug**
-A clear and concise description of what the bug is.
-
-**To Reproduce**
-Steps to reproduce the behavior:
-1. Clone '...'
-2. Install '....'
-3. Run command '....'
-4. See error
-
-**Expected behavior**
-A clear and concise description of what you expected to happen.
-
-**Screenshots**
-If applicable, add screenshots to help explain your problem.
-
-**Environment (please complete the following information):**
-- OS: [Windows / MacOS / Linux]
-- Python version [e.g. 3.8]
-- transformers version [e.g. 4.23.1]
-- PyTorch Version [e.g. 1.12]
-- CUDA Support [The output of running `python -c "import torch; print(torch.cuda.is_available())"`]
-
-**Additional context**
-Add any other context about the problem here.
+Issue tracker is **ONLY** used for reporting bugs.
+<!--- Provide a general summary of the issue in the Title above -->
+
+## Expected Behavior
+<!--- Tell us what should happen -->
+
+## Current Behavior (Screenshot)
+<!--- Tell us what happens instead of the expected behavior -->
+
+## Steps to Reproduce
+<!--- Provide a link to a live example, or an unambiguous set of steps to -->
+<!--- reproduce this bug. Include code to reproduce, if relevant -->
+1.
+2.
+3.
+4.
+
+## Environment
+<!--- OS: [Windows / MacOS / Linux] -->
+<!--- Python version [e.g. 3.8] -->
+<!--- transformers version [e.g. 4.23.1] -->
+<!--- PyTorch Version [e.g. 1.12] -->
+<!--- CUDA Support [The output of running `python -c "import torch; print(torch.cuda.is_available())"`] -->
+
+<!--- Provide a general summary of the issue in the Title above -->
+## Detailed Description
+<!--- Provide a detailed description of the change or addition you are proposing -->
+
+## Possible Implementation
+<!--- Not obligatory, but suggest an idea for implementing addition or change -->
+
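The `## Environment` section of the updated template asks reporters for their OS, Python, transformers, and PyTorch versions plus the output of the CUDA check command shown in the comment. A minimal sketch (not part of the template or this commit) that prints all of these in one run, assuming `torch` and `transformers` are installed:

```python
# Hypothetical helper for filling in the "## Environment" section of the
# bug report; prints each field the template asks for.
import platform
import sys

import torch
import transformers

print("OS:", platform.platform())
print("Python version:", sys.version.split()[0])
print("transformers version:", transformers.__version__)
print("PyTorch version:", torch.__version__)
# Same check as the template's comment: whether CUDA is available to PyTorch.
print("CUDA Support:", torch.cuda.is_available())
```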