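// MICROSOFT_SPEECH selects the Microsoft Speech Platform (server) assemblies;
// comment the #define out to fall back to the desktop System.Speech API instead.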
#define MICROSOFT_SPEECH
#if MICROSOFT_SPEECH
using Microsoft.Speech;
using Microsoft.Speech.Synthesis;
using Microsoft.Speech.Recognition;
using Microsoft.Speech.Recognition.SrgsGrammar;
#else
using System.Speech;
using System.Speech.Synthesis;
using System.Speech.Recognition;
using System.Speech.Recognition.SrgsGrammar;
#endif
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Data;
using System.Windows.Documents;
using System.Windows.Input;
using System.Windows.Media;
using System.Windows.Media.Imaging;
using System.Windows.Navigation;
using System.Windows.Shapes;
using System.Media;
using System.IO;
namespace RecoVocal
{
/// <summary>
/// Interaction logic for MainWindow.xaml
/// </summary>
public partial class MainWindow : Window
{
SoundPlayer _snd = new SoundPlayer();
SpeechRecognitionEngine asr;
SpeechSynthesizer synth;
public MainWindow()
{
InitializeComponent();
synth = new SpeechSynthesizer();
asr = new SpeechRecognitionEngine();
asr.SetInputToDefaultAudioDevice();
asr.SpeechRecognized += new EventHandler<SpeechRecognizedEventArgs>(asr_SpeechRecognized);
asr.LoadGrammarCompleted += new EventHandler<LoadGrammarCompletedEventArgs>(asr_LoadGrammarCompleted);
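// Load the command grammar asynchronously, then listen continuously:
// RecognizeMode.Multiple restarts recognition after every result.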
asr.LoadGrammarAsync(new Grammar("../../Commandes.xml"));
asr.RecognizeAsync(RecognizeMode.Multiple);
}
void s_SpeakCompleted(object sender, SpeakCompletedEventArgs e)
{
if (e.Error != null)
{
MessageBox.Show(e.Error.Message);
return;
}
#if MICROSOFT_SPEECH
JouerSoundPlayer();
#endif
}
void SynthetiserAsync()
{
PromptBuilder builder = new PromptBuilder(new System.Globalization.CultureInfo("fr-fr"));
// Unsubscribe first so repeated "LECTURE" commands do not stack SpeakCompleted handlers.
synth.SpeakCompleted -= new EventHandler<SpeakCompletedEventArgs>(s_SpeakCompleted);
synth.SpeakCompleted += new EventHandler<SpeakCompletedEventArgs>(s_SpeakCompleted);
if (textBox1.Text == String.Empty)
{
builder.AppendText("Vous devez entrer un texte à synthétiser");
}
else
{
builder.ClearContent();
builder.AppendText(textBox1.Text);
}
#if MICROSOFT_SPEECH
ConfigureSoundPlayer(synth);
#endif
synth.SpeakAsync(builder);
}
void asr_SpeechRecognized(object sender, SpeechRecognizedEventArgs e)
{
// Ignore results that carry no semantic value.
if (e.Result.Semantics == null || e.Result.Semantics.Value == null) return;
String valeur = e.Result.Semantics.Value.ToString();
switch (valeur)
{
case "ARRETER":
synth.SpeakAsyncCancelAll();
break;
case "LECTURE":
SynthetiserAsync();
break;
case "PAUSE":
synth.Pause();
break;
case "REPRENDRE":
synth.Resume();
break;
case "QUITTER":
this.Close();
break;
default:
break;
}
}
void asr_LoadGrammarCompleted(object sender, LoadGrammarCompletedEventArgs e)
{
if (e.Error != null)
{
MessageBox.Show(e.Error.Message);
}
}
private void button1_Click(object sender, RoutedEventArgs e)
{
SpeechSynthesizer s = new SpeechSynthesizer();
var voix = s.GetInstalledVoices();
listBox1.Items.Clear(); // avoid duplicating entries on repeated clicks
foreach (InstalledVoice v in voix)
{
listBox1.Items.Add(v.VoiceInfo.Name);
}
}
private void button2_Click(object sender, RoutedEventArgs e)
{
SpeechSynthesizer s = new SpeechSynthesizer();
String texte = textBox1.Text;
#if MICROSOFT_SPEECH
String voix = "Microsoft Server Speech Text to Speech Voice (fr-FR, Hortense)";
ConfigureSoundPlayer(s);
#else
String voix="ScanSoft Virginie_Dri40_16kHz";
#endif
s.SelectVoice(voix);
s.Speak(texte);
#if MICROSOFT_SPEECH
JouerSoundPlayer();
#endif
}
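// With the Speech Platform (Microsoft.Speech) the synthesizer output is redirected
// to an in-memory wave stream and played back through SoundPlayer, since the server
// voices are not routed straight to the default audio device in this sample.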
private void ConfigureSoundPlayer(SpeechSynthesizer s)
{
_snd.Stream = new MemoryStream();
s.SetOutputToWaveStream(_snd.Stream);
}
private void JouerSoundPlayer()
{
_snd.Stream.Position = 0;
_snd.Play();
}
private void button3_Click(object sender, RoutedEventArgs e)
{
SpeechSynthesizer s = new SpeechSynthesizer();
PromptBuilder builder = new PromptBuilder(new System.Globalization.CultureInfo("fr-fr"));
s.SpeakCompleted += new EventHandler<SpeakCompletedEventArgs>(s_SpeakCompleted);
builder.AppendText(textBox1.Text);
#if MICROSOFT_SPEECH
ConfigureSoundPlayer(s);
#endif
s.SpeakAsync(builder);
}
private void button4_Click(object sender, RoutedEventArgs e)
{
SpeechSynthesizer s = new SpeechSynthesizer();
PromptBuilder builder = new PromptBuilder(new System.Globalization.CultureInfo("fr-fr"));
s.SpeakCompleted += new EventHandler<SpeakCompletedEventArgs>(s_SpeakCompleted);
// SayAs.NumberOrdinal tells the engine to read the text as an ordinal number.
builder.AppendTextWithHint(textBox1.Text, SayAs.NumberOrdinal);
#if MICROSOFT_SPEECH
ConfigureSoundPlayer(s);
#endif
s.SpeakAsync(builder);
}
private void button5_Click(object sender, RoutedEventArgs e)
{
SpeechSynthesizer s = new SpeechSynthesizer();
s.SpeakCompleted += new EventHandler<SpeakCompletedEventArgs>(s_SpeakCompleted);
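// FichierSSML.xml is expected to contain SSML markup; FilePrompt hands it to the engine as-is.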
FilePrompt prompt = new FilePrompt("../../FichierSSML.xml", SynthesisMediaType.Ssml);
#if MICROSOFT_SPEECH
ConfigureSoundPlayer(s);
#endif
s.SpeakAsync(prompt);
}
}
}
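The sample relies on an external SRGS file, Commandes.xml, whose contents are not shown here. As a sketch only, an equivalent command grammar could be built in code with Choices, SemanticResultValue and GrammarBuilder (available under both Microsoft.Speech and System.Speech, so it works with either branch of the #if above). The French trigger phrases below are assumptions; only the semantic values come from the switch in asr_SpeechRecognized.

// Hypothetical replacement for Commandes.xml, to be added inside MainWindow.
// The spoken phrases are placeholders to adapt; the semantic values match the switch.
private Grammar ConstruireGrammaireCommandes()
{
    Choices commandes = new Choices();
    commandes.Add(new SemanticResultValue("arrête", "ARRETER"));
    commandes.Add(new SemanticResultValue("lecture", "LECTURE"));
    commandes.Add(new SemanticResultValue("pause", "PAUSE"));
    commandes.Add(new SemanticResultValue("reprendre", "REPRENDRE"));
    commandes.Add(new SemanticResultValue("quitter", "QUITTER"));

    GrammarBuilder builder = new GrammarBuilder(commandes);
    builder.Culture = new System.Globalization.CultureInfo("fr-FR");
    return new Grammar(builder);
}

The constructor could then call asr.LoadGrammarAsync(ConstruireGrammaireCommandes()) instead of loading the XML file.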